7 Assessment Results
Chapter 7 of the Dynamic Learning Maps® (DLM®) Alternate Assessment System 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016) describes assessment results for the 2014–2015 academic year, including student participation and performance summaries, and an overview of data files and score reports delivered to state education agencies. Technical Manual updates provide a description of data files, score reports, and results for each corresponding academic year.
This chapter presents 2020–2021 student participation data; the percentage of students achieving at each performance level; and subgroup performance by gender, race, ethnicity, and English learner status. This chapter also reports the distribution of students by the highest linkage level mastered during 2020–2021. Finally, this chapter describes updates made to score reports during the 2020–2021 operational year. For a complete description of score reports and interpretive guides, see Chapter 7 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016).
In this chapter we describe the results that were reported as part of the 2020–2021 assessment administration. However, due to the confounding factors of assessment administration changes and COVID-19, these results should be interpreted with caution and should not be directly compared to previous assessment administrations.
7.1 Impacts to Assessment Administration
Multiple factors potentially impacted assessment administration and performance during the 2020–2021 school year. First were changes originally introduced for the 2019–2020 school year, which first took effect in 2020–2021 because of the cancellation of testing in spring 2020. These included the adoption of a revised blueprint for both ELA and mathematics, which reduced the number of Essential Elements (EEs) on which students are assessed (as described in Chapter 3 of this manual). The blueprint revision was accompanied by a change in the operational item pool from testlets that measure multiple EEs to testlets that each measure only one EE, allowing each EE to be measured by more items. The blueprint revision also required an adjustment to the cut points used to determine the overall performance level in each subject, as described in Chapter 6 of this manual.
In addition, the 2020–2021 school year was significantly impacted by COVID-19. Overall, participation in DLM assessments across all states was lower than what would typically be expected. This decrease was not uniform across demographic subgroups. White students made up a larger percentage of the student population in 2020–2021 than in prior years, whereas African American students, students of Hispanic ethnicity, and English learners made up a smaller percentage of the student population. There were also fewer students who were placed in the Foundational and Band 3 complexity bands, which are used to determine the starting linkage level in each subject (see Chapter 4 of this manual for a description of linkage level assignment). Further, data from the spring teacher survey indicated that students may have had less opportunity to learn, and that many students experienced difficulty with remote learning.
For a complete discussion of student performance and the potential impacts of assessment administration changes and COVID-19, see Accessible Teaching, Learning, and Assessment Systems (2021).
7.2 Student Participation
During spring 2021, assessments were administered to 57,611 students in 14 states. Two states chose to extend their testing window through September 2021; for these states, results are included only for students who completed their assessments by the close of the standard consortium testing window on July 2, 2021. Counts of students tested in each state are displayed in Table 7.1. The assessments were administered by 18,358 educators in 10,124 schools and 3,551 school districts.
Table 7.2 summarizes the number of students assessed in each grade. In grades 3 through 8, over 7,600 students participated in each grade. In high school, the largest number of students participated in grade 11, and the smallest number participated in grade 12. The differences in high school grade-level participation can be traced to differing state-level policies about the grade(s) in which students are assessed.
Table 7.3 summarizes the demographic characteristics of the students who participated in the spring 2021 administration. The majority of participants were male (68%) and White (65%). About 5% of students were monitored or eligible for English learner services.
| Subgroup | n | % |
| --- | --- | --- |
| Two or more races | 6,662 | 11.6 |
| Native Hawaiian or Pacific Islander | 254 | 0.4 |
| English learner participation | | |
| Not EL eligible or monitored | 54,849 | 95.2 |
| EL eligible or monitored | 2,762 | 4.8 |
In addition to the spring administration, instructionally embedded assessments are also made available for teachers to administer to students during the year. Results from these assessments do not contribute to final summative scoring but can be used to guide instructional decision-making. Table 7.4 summarizes the number of students participating in instructionally embedded testing by state. A total of 242 students across 11 states took at least one instructionally embedded testlet during the 2020–2021 academic year.
Table 7.5 summarizes the number of instructionally embedded testlets taken in ELA and mathematics. Across 11 states, students took 2,234 ELA testlets and 2,244 mathematics testlets.
| Grade | English language arts | Mathematics |
7.3 Student Performance
Student performance on DLM assessments is interpreted using cut points, determined during standard setting, which categorize student results into four performance levels. For a full description of the standard-setting process, see Chapter 6 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016). Following changes to the assessment blueprint in 2019–2020, a standards adjustment process was used to update the cut points in 2020–2021. For a description of the standards adjustment process, see Chapter 6 of this manual. A student’s performance level is determined based on the total number of linkage levels mastered across the assessed Essential Elements (EEs).
For the spring 2021 administration, student performance was reported using the same four performance levels approved by the DLM Consortium for prior years:
- The student demonstrates Emerging understanding of and ability to apply content knowledge and skills represented by the EEs.
- The student’s understanding of and ability to apply targeted content knowledge and skills represented by the EEs is Approaching the Target.
- The student’s understanding of and ability to apply content knowledge and skills represented by the EEs is At Target. This performance level is considered to be meeting achievement expectations.
- The student demonstrates Advanced understanding of and ability to apply targeted content knowledge and skills represented by the EEs.
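The mapping from the total number of linkage levels mastered to one of these four performance levels can be sketched as follows. This is an illustrative sketch only, not the operational DLM scoring code, and the cut-point values shown are hypothetical placeholders; actual cut points come from standard setting and the standards adjustment process (see Chapter 6).

```python
# Illustrative sketch (NOT operational DLM code): assign a performance level
# from the total count of linkage levels mastered across assessed EEs.
# The cut points are hypothetical placeholders, not actual DLM cut points.

def performance_level(levels_mastered: int, cuts: tuple) -> str:
    """Map a total linkage-level mastery count to a performance level.

    cuts holds the minimum counts needed for Approaching the Target,
    At Target, and Advanced, in that order.
    """
    approaching_cut, target_cut, advanced_cut = cuts
    if levels_mastered >= advanced_cut:
        return "Advanced"
    if levels_mastered >= target_cut:
        return "At Target"
    if levels_mastered >= approaching_cut:
        return "Approaching the Target"
    return "Emerging"

# With hypothetical cut points of (12, 29, 45) linkage levels:
print(performance_level(31, (12, 29, 45)))  # At Target
```

Because the performance level depends only on the total count of mastered linkage levels relative to the subject's cut points, the same comparison logic applies in every grade and subject; only the cut-point values differ.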
7.3.1 Overall Performance
Table 7.6 reports the percentage of students achieving at each performance level from the spring 2021 administration for ELA and mathematics. For ELA, the percentage of students who achieved at the At Target or Advanced levels ranged from approximately 23% to 35%. In mathematics, the percentage of students meeting or exceeding Target expectations ranged from approximately 10% to 43%.
| Grade | Emerging (%) | Approaching (%) | Target (%) | Advanced (%) | Target + Advanced (%) |
| --- | --- | --- | --- | --- | --- |
| English language arts | | | | | |
| 3 (n = 7,600) | 58.7 | 10.7 | 28.2 | 2.4 | 30.7 |
| 4 (n = 7,595) | 59.3 | 18.2 | 20.3 | 2.3 | 22.5 |
| 5 (n = 7,677) | 53.5 | 11.8 | 30.1 | 4.6 | 34.7 |
| 6 (n = 7,728) | 43.6 | 22.4 | 25.3 | 8.8 | 34.1 |
| 7 (n = 7,839) | 43.8 | 25.6 | 24.6 | 6.0 | 30.6 |
| 8 (n = 7,646) | 41.9 | 31.4 | 26.3 | 0.4 | 26.7 |
| 9 (n = 3,107) | 33.9 | 34.1 | 26.8 | 5.2 | 32.0 |
| 10 (n = 1,224) | 29.5 | 42.7 | 27.1 | 0.7 | 27.8 |
| 11 (n = 5,629) | 32.8 | 33.8 | 27.6 | 5.7 | 33.3 |
| 12 (n = 542) | 33.6 | 34.7 | 27.3 | 4.4 | 31.7 |
| Mathematics | | | | | |
| 3 (n = 7,356) | 62.3 | 14.9 | 14.1 | 8.7 | 22.8 |
| 4 (n = 7,767) | 47.7 | 9.5 | 22.9 | 19.9 | 42.8 |
| 5 (n = 7,374) | 42.7 | 27.6 | 17.0 | 12.7 | 29.7 |
| 6 (n = 8,012) | 56.6 | 22.6 | 11.9 | 8.9 | 20.8 |
| 7 (n = 7,615) | 63.1 | 19.9 | 11.9 | 5.1 | 17.0 |
| 8 (n = 7,886) | 53.7 | 36.4 | 6.3 | 3.5 | 9.9 |
| 9 (n = 3,103) | 54.4 | 16.3 | 26.1 | 3.1 | 29.2 |
| 10 (n = 1,223) | 60.1 | 28.1 | 10.1 | 1.6 | 11.8 |
| 11 (n = 5,666) | 41.5 | 25.9 | 28.9 | 3.6 | 32.5 |
| 12 (n = 551) | 47.4 | 26.0 | 24.7 | 2.0 | 26.7 |
7.3.2 Subgroup Performance
Data collection for DLM assessments includes demographic data on gender, race, ethnicity, and English learner status. Table 7.7 and Table 7.8 summarize the disaggregated frequency distributions for ELA and mathematics, respectively, collapsed across all assessed grade levels. Although state education agencies each have their own rules for minimum student counts needed to support public reporting of results, small counts are not suppressed here because results are aggregated across states and individual students cannot be identified.
| Subgroup | Emerging n | Emerging (%) | Approaching n | Approaching (%) | Target n | Target (%) | Advanced n | Advanced (%) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Two or more races | 3,261 | 49.7 | 1,514 | 23.1 | 1,540 | 23.5 | 240 | 3.7 |
| Native Hawaiian or Pacific Islander | 135 | 54.0 | 53 | 21.2 | 54 | 21.6 | 8 | 3.2 |
| English learner participation | | | | | | | | |
| Not EL eligible or monitored | 25,188 | 46.8 | 12,244 | 22.7 | 14,106 | 26.2 | 2,325 | 4.3 |
| EL eligible or monitored | 1,332 | 48.9 | 663 | 24.3 | 643 | 23.6 | 86 | 3.2 |
| Subgroup | Emerging n | Emerging (%) | Approaching n | Approaching (%) | Target n | Target (%) | Advanced n | Advanced (%) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Two or more races | 3,628 | 55.2 | 1,466 | 22.3 | 1,011 | 15.4 | 469 | 7.1 |
| Native Hawaiian or Pacific Islander | 144 | 58.3 | 43 | 17.4 | 40 | 16.2 | 20 | 8.1 |
| English learner participation | | | | | | | | |
| Not EL eligible or monitored | 28,587 | 53.1 | 11,952 | 22.2 | 8,692 | 16.2 | 4,587 | 8.5 |
| EL eligible or monitored | 1,462 | 53.5 | 580 | 21.2 | 443 | 16.2 | 250 | 9.1 |
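The subgroup percentages above are row percentages: each subgroup's counts at the four performance levels are divided by that subgroup's total, so the four percentages in a row sum to roughly 100 (within rounding). A minimal sketch of this calculation, using the "EL eligible or monitored" ELA counts reported above:

```python
# Illustrative sketch: derive row percentages for a subgroup from raw counts,
# as displayed in the subgroup performance tables. Counts are the
# "EL eligible or monitored" ELA row.
counts = {"Emerging": 1332, "Approaching": 663, "At Target": 643, "Advanced": 86}
total = sum(counts.values())  # subgroup total across performance levels

# Each percentage is the subgroup's count at a level divided by its total.
row_pct = {level: round(100 * n / total, 1) for level, n in counts.items()}
print(total, row_pct)  # 2724 {'Emerging': 48.9, 'Approaching': 24.3, 'At Target': 23.6, 'Advanced': 3.2}
```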
7.4 Data Files
Data files were made available to DLM state education agencies following the spring 2021 administration. Similar to prior years, the General Research File (GRF) contained student results, including each student's highest linkage level mastered for each EE and the final performance level in each subject, for all students who completed any testlets. In addition to the GRF, the DLM Consortium delivered several supplemental files. Consistent with prior years, the special circumstances file provided information about which students and EEs were affected by extenuating circumstances (e.g., chronic absences), as defined by each state. Three new special circumstance codes were available in 2020–2021: (a) student could not test due to COVID-19, (b) teacher administered the assessment remotely, and (c) non-teacher administered. State education agencies also received a supplemental file to identify exited students; the exited students file included all students who exited at any point during the academic year. When incidents are observed during assessment delivery, state education agencies are provided with an incident file describing the students impacted; however, no incidents occurred for ELA and mathematics during 2020–2021. For more information about assessment delivery during the 2020–2021 administration, see Chapter 4 of this manual.
Consistent with prior delivery cycles, state partners were provided with a two-week review window following data file delivery to review the files and invalidate student records in the GRF. Decisions about whether to invalidate student records are informed by individual state policy. If changes were made to the GRF, state partners submitted final GRFs via Educator Portal. The final GRF was used to generate score reports.
In addition to the GRF and its supplemental files, states were provided with two additional de-identified data files: a teacher survey data file and a test administration observations data file. The teacher survey file provided state-specific teacher survey responses, with all identifying information about the student and educator removed. The test administration observations file provided test administration observation responses with any identifying information removed. For more information regarding teacher survey content and response rates, see Chapter 4 of this manual. For more information about test administration observation results, see Chapter 9 of this manual.
7.5 Score Reports
The DLM Consortium provides assessment results to all member states to report to parents/guardians, educators, and state and local education agencies. Individual Student Score Reports summarized student performance on the assessment by subject. Several aggregated reports were provided to state and local education agencies, including reports for the classroom, school, district, and state. No changes were made to the structure of aggregated reports during spring 2021. Changes to the Individual Student Score Reports are summarized below. For a complete description of score reports, including aggregated reports, see Chapter 7 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016).
7.5.1 Individual Student Score Reports
Individual Student Score Reports included a Performance Profile section, which describes student performance in the subject overall. In 2021, a Learning Profile section was added to the reports, which provides detailed reporting of student mastery of individual skills. The Learning Profile section was added due to changes in the year-end blueprint in 2019–2020. A cautionary statement was added to the 2020–2021 Performance Profile and Learning Profile indicating that the 2020–2021 academic year was significantly impacted by the COVID-19 pandemic and that mastery results may reflect the unusual circumstances for instruction and assessment. For more information on validity considerations and scoring and reporting in flexible scenarios, see A. K. Clark et al. (2021).
Other minor changes were made to the Individual Student Score Reports. The "Conceptual Area" heading in the Performance Profile was renamed "Area" to match the heading on the Learning Profile, and the area code used in the Learning Profile was added to the Performance Profile labels for consistency. EEs in the Learning Profile table were reordered to match the blueprint order, and the hyperlink to the DLM website was set in a larger font and moved to the footer, before the copyright statement.
7.6 Quality Control Procedures for Data Files and Score Reports
Changes to the quality control procedures were made only to the extent of accommodating the revised score reports for spring 2021 (i.e., checking to be sure changes were correctly and consistently applied). For a complete description of quality control procedures, see Chapter 7 in the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016) and Chapter 7 in the 2015–2016 Technical Manual Update—Year-End Model (Dynamic Learning Maps Consortium, 2017a).
Following the spring 2021 administration, five data files were delivered to state partners: GRF, special circumstance code file, exited students file, teacher survey data file, and test administration observations file. No incidents were observed during the 2020–2021 administration, so an incident file was not needed. Overall, between 10% and 43% of students achieved at the At Target or Advanced levels across all grades and subjects. However, these results should be interpreted with caution due to the confounding factors of assessment administration changes and COVID-19. Lastly, minor changes were made to score reports to aid in interpretation.