7 Assessment Results

Chapter 7 of the Dynamic Learning Maps® (DLM®) Alternate Assessment System 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016) describes assessment results for the 2014–2015 academic year, including student participation and performance summaries, and an overview of data files and score reports delivered to state education agencies. Technical Manual updates provide a description of data files, score reports, and results for each corresponding academic year.

This chapter presents 2020–2021 student participation data; the percentage of students achieving at each performance level; and subgroup performance by gender, race, ethnicity, and English learner status. This chapter also reports the distribution of students by the highest linkage level mastered during 2020–2021. Finally, this chapter describes updates made to score reports during the 2020–2021 operational year. For a complete description of score reports and interpretive guides, see Chapter 7 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016).


In this chapter we describe the results that were reported as part of the 2020–2021 assessment administration. However, due to the confounding factors of assessment administration changes and COVID-19, these results should be interpreted with caution and should not be directly compared to results from previous assessment administrations.


7.1 Impacts to Assessment Administration

Multiple factors potentially impacted assessment administration and performance during the 2020–2021 school year. The first was a set of changes originally implemented for the 2019–2020 school year that first took effect in 2020–2021 because testing was canceled in spring 2020. These changes included the adoption of a revised blueprint for both ELA and mathematics, which reduced the number of Essential Elements (EEs) on which students are assessed (as described in Chapter 3 of this manual). The blueprint revision was accompanied by a change in the operational item pool from testlets that measure multiple EEs to testlets that each measure a single EE, allowing each EE to be measured by more items. The blueprint revision also required an adjustment to the cut points used to determine the overall performance level in each subject, as described in Chapter 6 of this manual.

In addition, the 2020–2021 school year was significantly impacted by COVID-19. Overall, participation in DLM assessments across all states was lower than would typically be expected. This decrease was not uniform across demographic subgroups. White students made up a larger percentage of the student population in 2020–2021 than in prior years, whereas African American students, students of Hispanic ethnicity, and English learners made up a smaller percentage of the student population. Fewer students were placed in the Foundational and Band 3 complexity bands, which are used to determine the starting linkage level in each subject (see Chapter 4 of this manual for a description of linkage level assignment). Further, data from the spring teacher survey indicated that students may have had less opportunity to learn and that many students experienced difficulty with remote learning.

For a complete discussion of student performance and the potential impacts of assessment administration changes and COVID-19, see Accessible Teaching, Learning, and Assessment Systems (2021).

7.2 Student Participation

During spring 2021, assessments were administered to 57,611 students in 14 states. Two states chose to extend their testing window through September 2021; for these states, results are included only for students who completed their assessments by the close of the standard consortium testing window on July 2, 2021. Counts of students tested in each state are displayed in Table 7.1. The assessments were administered by 18,358 educators in 10,124 schools and 3,551 school districts.

Table 7.1: Student Participation by State (N = 57,611)
State Students (n)
Alaska 336
Colorado 3,100
Delaware 636
Illinois 7,511
New Hampshire 620
New Jersey 7,080
New Mexico 136
New York 12,367
Oklahoma 4,690
Pennsylvania 11,641
Rhode Island 820
Utah 3,608
West Virginia 1,286
Wisconsin 3,780

Table 7.2 summarizes the number of students assessed in each grade. In grades 3 through 8, over 7,600 students participated in each grade. In high school, the largest number of students participated in grade 11, and the smallest number participated in grade 12. The differences in high school grade-level participation can be traced to differing state-level policies about the grade(s) in which students are assessed.

Table 7.2: Student Participation by Grade (N = 57,611)
Grade Students (n)
3 7,628
4 7,814
5 7,701
6 8,039
7 7,904
8 7,929
9 3,117
10 1,226
11 5,695
12 558

Table 7.3 summarizes the demographic characteristics of the students who participated in the spring 2021 administration. The majority of participants were male (68%) and white (65%). About 5% of students were monitored or eligible for English learner services.

Table 7.3: Demographic Characteristics of Participants (N = 57,611)
Subgroup n %
Gender
Male 39,255 68.1
Female 18,356 31.9
Race
White 37,168 64.5
African American 9,382 16.3
Two or more races 6,662 11.6
Asian 2,587 4.5
American Indian 1,469 2.5
Native Hawaiian or Pacific Islander 254 0.4
Alaska Native 89 0.2
Hispanic ethnicity
No 45,871 79.6
Yes 11,740 20.4
English learner participation
Not EL eligible or monitored 54,849 95.2
EL eligible or monitored 2,762 4.8

In addition to the spring administration, instructionally embedded assessments are also made available for teachers to administer to students during the year. Results from these assessments do not contribute to final summative scoring but can be used to guide instructional decision-making. Table 7.4 summarizes the number of students participating in instructionally embedded testing by state. A total of 242 students across 11 states took at least one instructionally embedded testlet during the 2020–2021 academic year.

Table 7.4: Students Completing Instructionally Embedded Testlets by State (N = 242)
State n
Colorado 6
Delaware 8
Illinois 1
New Hampshire 12
New Jersey 22
New York 5
Oklahoma 175
Rhode Island 3
Utah 6
West Virginia 2
Wisconsin 2

Table 7.5 summarizes the number of instructionally embedded testlets taken in ELA and mathematics. Across 11 states, students took 2,234 ELA testlets and 2,244 mathematics testlets.

Table 7.5: Number of Instructionally Embedded Testlets by Grade
Grade English language arts Mathematics
3 318 301
4 360 376
5 302 213
6 322 247
7 361 350
8 391 331
9 8 7
10 1 0
11 155 391
12 16 28
Total 2,234 2,244

7.3 Student Performance

Student performance on DLM assessments is interpreted using cut points, determined during standard setting, which categorize student results into four performance levels. For a full description of the standard-setting process, see Chapter 6 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016). Following changes to the assessment blueprint in 2019–2020, a standards adjustment process was used to update the cut points in 2020–2021. For a description of the standards adjustment process, see Chapter 6 of this manual. A student’s performance level is determined based on the total number of linkage levels mastered across the assessed Essential Elements (EEs).

For the spring 2021 administration, student performance was reported using the same four performance levels approved by the DLM Consortium for prior years:

  • The student demonstrates Emerging understanding of and ability to apply content knowledge and skills represented by the EEs.
  • The student’s understanding of and ability to apply targeted content knowledge and skills represented by the EEs is Approaching the Target.
  • The student’s understanding of and ability to apply content knowledge and skills represented by the EEs is At Target. This performance level is considered to be meeting achievement expectations.
  • The student demonstrates Advanced understanding of and ability to apply targeted content knowledge and skills represented by the EEs.
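
As a hedged illustration of this scoring logic (not the operational implementation), the following Python sketch maps a student's total count of mastered linkage levels to one of the four performance levels. The cut-point values shown are placeholders; the operational cut points vary by grade and subject and are documented in Chapter 6 of this manual.

```python
# Minimal sketch of performance-level assignment from the total number of
# linkage levels mastered. The cut points below are placeholders for
# illustration; operational cut points are grade- and subject-specific.

PERFORMANCE_LEVELS = ["Emerging", "Approaching the Target", "At Target", "Advanced"]

# Hypothetical cut points: minimum totals needed to reach Approaching the
# Target, At Target, and Advanced, respectively.
EXAMPLE_CUT_POINTS = [12, 24, 36]


def performance_level(total_linkage_levels_mastered, cut_points=EXAMPLE_CUT_POINTS):
    """Return the performance level implied by a total mastery count."""
    level_index = sum(total_linkage_levels_mastered >= cut for cut in cut_points)
    return PERFORMANCE_LEVELS[level_index]


# Example: a student who mastered 26 linkage levels across the assessed EEs
# falls in the At Target range under these placeholder cut points.
print(performance_level(26))  # At Target
```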

7.3.1 Overall Performance

Table 7.6 reports the percentage of students achieving at each performance level from the spring 2021 administration for ELA and mathematics. For ELA, the percentage of students who achieved at the At Target or Advanced levels ranged from approximately 23% to 35%. In mathematics, the percentage of students meeting or exceeding Target expectations ranged from approximately 10% to 43%.

Table 7.6: Percentage of Students by Grade and Performance Level
Grade Emerging (%) Approaching (%) Target (%) Advanced (%) Target + Advanced (%)
English language arts
3 (n = 7,600) 58.7 10.7 28.2 2.4 30.7
4 (n = 7,595) 59.3 18.2 20.3 2.3 22.5
5 (n = 7,677) 53.5 11.8 30.1 4.6 34.7
6 (n = 7,728) 43.6 22.4 25.3 8.8 34.1
7 (n = 7,839) 43.8 25.6 24.6 6.0 30.6
8 (n = 7,646) 41.9 31.4 26.3 0.4 26.7
9 (n = 3,107) 33.9 34.1 26.8 5.2 32.0
10 (n = 1,224) 29.5 42.7 27.1 0.7 27.8
11 (n = 5,629) 32.8 33.8 27.6 5.7 33.3
12 (n = 542) 33.6 34.7 27.3 4.4 31.7
Mathematics
3 (n = 7,356) 62.3 14.9 14.1 8.7 22.8
4 (n = 7,767) 47.7 9.5 22.9 19.9 42.8
5 (n = 7,374) 42.7 27.6 17.0 12.7 29.7
6 (n = 8,012) 56.6 22.6 11.9 8.9 20.8
7 (n = 7,615) 63.1 19.9 11.9 5.1 17.0
8 (n = 7,886) 53.7 36.4 6.3 3.5 9.9
9 (n = 3,103) 54.4 16.3 26.1 3.1 29.2
10 (n = 1,223) 60.1 28.1 10.1 1.6 11.8
11 (n = 5,666) 41.5 25.9 28.9 3.6 32.5
12 (n = 551) 47.4 26.0 24.7 2.0 26.7

7.3.2 Subgroup Performance

Data collection for DLM assessments includes demographic data on gender, race, ethnicity, and English learner status. Table 7.7 and Table 7.8 summarize the disaggregated frequency distributions for ELA and mathematics, respectively, collapsed across all assessed grade levels. Although state education agencies each have their own rules for minimum student counts needed to support public reporting of results, small counts are not suppressed here because results are aggregated across states and individual students cannot be identified.
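
To make the construction of these tables concrete, the following Python sketch shows one way such a cross-tabulation could be produced, assuming a hypothetical student-level file with one row per student. The column names and example values are illustrative only and are not the operational GRF field names.

```python
# Sketch of the subgroup tabulation behind Table 7.7 and Table 7.8, using a
# hypothetical student-level data set (one row per student).
import pandas as pd

students = pd.DataFrame({
    "gender": ["Male", "Female", "Male", "Female", "Male"],
    "performance_level": ["Emerging", "At Target", "Approaching the Target",
                          "Emerging", "Advanced"],
})

# Counts of students at each performance level within each subgroup value.
counts = pd.crosstab(students["gender"], students["performance_level"])

# Row percentages, matching the within-subgroup percentages reported in the tables.
percentages = pd.crosstab(students["gender"], students["performance_level"],
                          normalize="index") * 100

print(counts)
print(percentages.round(1))
```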

Table 7.7: ELA Performance Level Distributions by Demographic Subgroup (N = 56,587)
Subgroup Emerging (n) Emerging (%) Approaching (n) Approaching (%) Target (n) Target (%) Advanced (n) Advanced (%)
Gender
Male 18,213 47.2 8,742 22.7 10,001 25.9 1,632 4.2
Female 8,307 46.2 4,165 23.1 4,748 26.4 779 4.3
Race
White 16,657 45.7 8,329 22.8 9,807 26.9 1,663 4.6
African American 4,341 46.9 2,133 23.0 2,423 26.2 360 3.9
Two or more races 3,261 49.7 1,514 23.1 1,540 23.5 240 3.7
Asian 1,487 58.2 493 19.3 490 19.2 87 3.4
American Indian 584 41.0 371 26.1 417 29.3 52 3.7
Native Hawaiian or Pacific Islander 135 54.0 53 21.2 54 21.6 8 3.2
Alaska Native 55 62.5 14 15.9 18 20.5 1 1.1
Hispanic ethnicity
No 20,952 46.3 10,271 22.7 12,006 26.5 2,001 4.4
Yes 5,568 49.0 2,636 23.2 2,743 24.2 410 3.6
English learner participation
Not EL eligible or monitored 25,188 46.8 12,244 22.7 14,106 26.2 2,325 4.3
EL eligible or monitored 1,332 48.9 663 24.3 643 23.6 86 3.2

Table 7.8: Mathematics Performance Level Distributions by Demographic Subgroup (N = 56,553)
Subgroup Emerging (n) Emerging (%) Approaching (n) Approaching (%) Target (n) Target (%) Advanced (n) Advanced (%)
Gender
Male 20,184 52.4 8,341 21.6 6,370 16.5 3,653 9.5
Female 9,865 54.8 4,191 23.3 2,765 15.4 1,184 6.6
Race
White 19,183 52.7 8,141 22.4 5,933 16.3 3,144 8.6
African American 4,858 52.3 2,076 22.4 1,528 16.5 820 8.8
Two or more races 3,628 55.2 1,466 22.3 1,011 15.4 469 7.1
Asian 1,530 60.1 449 17.6 341 13.4 224 8.8
American Indian 648 45.8 338 23.9 274 19.4 156 11.0
Native Hawaiian or Pacific Islander 144 58.3 43 17.4 40 16.2 20 8.1
Alaska Native 58 65.2 19 21.3 8 9.0 4 4.5
Hispanic ethnicity
No 23,885 52.9 10,087 22.3 7,375 16.3 3,838 8.5
Yes 6,164 54.2 2,445 21.5 1,760 15.5 999 8.8
English learner participation
Not EL eligible or monitored 28,587 53.1 11,952 22.2 8,692 16.2 4,587 8.5
EL eligible or monitored 1,462 53.5 580 21.2 443 16.2 250 9.1

7.3.3 Linkage Level Mastery

As described earlier in the chapter, overall performance in each subject is calculated based on the number of linkage levels mastered across all EEs. Results indicate the highest linkage level the student mastered for each EE. The linkage levels are, in order: Initial Precursor, Distal Precursor, Proximal Precursor, Target, and Successor. For each EE, a student can master anywhere from zero to all five linkage levels, subject to the order constraint: for example, a student who masters the Proximal Precursor level is also credited with mastery of the lower levels (i.e., Initial Precursor and Distal Precursor). This section summarizes the distribution of students by the highest linkage level mastered across all EEs. For each student, the highest linkage level mastered across all tested EEs was identified. Then, for each grade and subject, the number of students for whom each linkage level was the highest level mastered was divided by the total number of students who tested in that grade and subject, yielding the proportion of students for whom each level was the highest level mastered.
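
The following Python sketch illustrates this calculation under the assumption of a hypothetical long-format results file with one row per student-EE combination; the column names and example data are illustrative only and do not reflect the operational DLM data files.

```python
# Sketch of the highest-linkage-level tabulation described above.
import pandas as pd

LEVEL_ORDER = ["No evidence", "Initial Precursor", "Distal Precursor",
               "Proximal Precursor", "Target", "Successor"]
LEVEL_RANK = {level: rank for rank, level in enumerate(LEVEL_ORDER)}

# Hypothetical long-format results: one row per student-EE combination, with
# the highest linkage level the student mastered for that EE.
results = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3],
    "grade": [3, 3, 3, 3, 3, 3],
    "subject": ["ELA"] * 6,
    "highest_level": ["Distal Precursor", "Target", "Initial Precursor",
                      "No evidence", "Successor", "Proximal Precursor"],
})

# Convert levels to ranks so the maximum per student is well defined.
results["level_rank"] = results["highest_level"].map(LEVEL_RANK)

# Highest linkage level mastered by each student across all tested EEs.
per_student = (results.groupby(["grade", "subject", "student_id"])["level_rank"]
                      .max()
                      .map(lambda rank: LEVEL_ORDER[rank]))

# Proportion of students in each grade and subject for whom each level was the
# highest level mastered (the quantity reported in Tables 7.9 and 7.10).
distribution = (per_student.groupby(level=["grade", "subject"])
                           .value_counts(normalize=True))
print(distribution)
```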

Table 7.9 and Table 7.10 report the percentage of students for whom each linkage level was the highest level mastered across all EEs for ELA and mathematics, respectively. For example, across all third-grade ELA EEs, the Initial Precursor level was the highest level mastered for approximately 11% of students. For ELA, the percentage of students whose highest mastered linkage level was Target or Successor ranged from approximately 46% in grade 3 to 59% in grade 11. For mathematics, the percentage of students whose highest mastered linkage level was Target or Successor ranged from approximately 18% in grade 10 to 48% in grade 4.

Table 7.9: Students’ Highest Linkage Level Mastered Across ELA EEs by Grade
Grade No evidence (%) IP (%) DP (%) PP (%) T (%) S (%)
3 (n = 7,600) 4.5 11.1 28.7 9.5 7.4 38.9
4 (n = 7,595) 4.9 5.3 27.4 13.6 10.3 38.6
5 (n = 7,677) 3.7 7.8 27.9 4.3 12.6 43.7
6 (n = 7,728) 3.9 6.9 22.8 15.6 4.8 46.0
7 (n = 7,839) 3.3 4.6 31.1 9.3 7.5 44.3
8 (n = 7,646) 4.5 6.3 24.3 12.5 15.4 36.9
9 (n = 3,107) 4.9 8.5 19.4 9.3 14.1 43.8
10 (n = 1,224) 3.8 7.7 20.8 10.2 18.0 39.5
11 (n = 5,629) 3.2 9.9 16.5 11.1 14.1 45.1
12 (n = 542) 3.3 9.8 18.1 12.4 12.7 43.7
Note. IP = Initial Precursor; DP = Distal Precursor; PP = Proximal Precursor; T = Target; S = Successor.
Table 7.10: Students’ Highest Linkage Level Mastered Across Mathematics EEs by Grade
Grade No evidence (%) IP (%) DP (%) PP (%) T (%) S (%)
3 (n = 7,356) 3.8 45.2 13.4 17.3 6.0 14.3
4 (n = 7,767) 5.7 27.3 12.2 6.4 20.3 28.0
5 (n = 7,374) 6.2 25.6 12.8 28.8 9.0 17.7
6 (n = 8,012) 8.9 27.1 11.3 28.0 6.7 17.9
7 (n = 7,615) 9.2 32.3 19.1 16.4 10.7 12.4
8 (n = 7,886) 10.9 15.3 23.0 28.9 11.2 10.7
9 (n = 3,103) 10.8 21.2 19.2 12.1 20.2 16.5
10 (n = 1,223) 7.0 33.8 24.9 16.8 10.5 7.1
11 (n = 5,666) 10.7 18.3 28.0 16.4 15.2 11.4
12 (n = 551) 10.7 22.1 30.3 16.0 14.2 6.7
Note. IP = Initial Precursor; DP = Distal Precursor; PP = Proximal Precursor; T = Target; S = Successor.

7.4 Data Files

Data files were made available to DLM state education agencies following the spring 2021 administration. Similar to prior years, the General Research File (GRF) contained student results, including each student's highest linkage level mastered for each EE and final performance level for the subject, for all students who completed any testlets. In addition to the GRF, the DLM Consortium delivered several supplemental files. Consistent with prior years, the special circumstances file provided information about which students and EEs were affected by extenuating circumstances (e.g., chronic absences), as defined by each state. Three new special circumstance codes were available in 2020–2021: (1) student could not test due to COVID-19, (2) teacher administered the assessment remotely, and (3) non-teacher administered the assessment. State education agencies also received a supplemental file identifying exited students; the exited students file included all students who exited at any point during the academic year. When incidents are observed during assessment delivery, state education agencies are provided with an incident file describing the students impacted; however, no incidents affecting ELA and mathematics occurred during 2020–2021, so no incident file was delivered. For a description of incidents observed during the 2020–2021 administration, see Chapter 4 of this manual.

Consistent with prior delivery cycles, state partners were provided with a two-week review window following data file delivery to review the files and invalidate student records in the GRF. Decisions about whether to invalidate student records are informed by individual state policy. If changes were made to the GRF, state partners submitted final GRFs via Educator Portal. The final GRF was used to generate score reports.

In addition to the GRF and its supplemental files, states were provided with two additional de-identified data files: a teacher survey data file and a test administration observations data file. The teacher survey file provided state-specific teacher survey responses, with all identifying information about the student and educator removed. The test administration observations file provided test administration observation responses with any identifying information removed. For more information regarding teacher survey content and response rates, see Chapter 4 of this manual. For more information about test administration observation results, see Chapter 9 of this manual.

7.5 Score Reports

The DLM Consortium provides assessment results to all member states to report to parents/guardians, educators, and state and local education agencies. Individual Student Score Reports summarized student performance on the assessment by subject. Several aggregated reports were provided to state and local education agencies, including reports for the classroom, school, district, and state. No changes were made to the structure of aggregated reports during spring 2021. Changes to the Individual Student Score Reports are summarized below. For a complete description of score reports, including aggregated reports, see Chapter 7 of the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016).

7.5.1 Individual Student Score Reports

Individual Student Score Reports included a Performance Profile section, which describes student performance in the subject overall. In 2021, a Learning Profile section was added to the reports, providing detailed reporting of student mastery of individual skills. The Learning Profile section was added following changes to the year-end blueprint in 2019–2020. A cautionary statement was added to the 2020–2021 Performance Profile and Learning Profile, indicating that the 2020–2021 academic year was significantly impacted by the COVID-19 pandemic and that mastery results may reflect the unusual circumstances for instruction and assessment. For more information on validity considerations and scoring and reporting in flexible scenarios, see A. K. Clark et al. (2021).

Other minor changes to the Individual Student Score Reports included changing the 'Conceptual Area' heading in the Performance Profile to 'Area' to match the 'Area' heading on the Learning Profile and adding the area code used in the Learning Profile to the corresponding labels in the Performance Profile for consistency. EEs in the Learning Profile table were reordered to match the blueprint order, and the hyperlink to the DLM website was set in a larger font and moved to the footer, before the copyright statement.

A sample Performance Profile and a sample Learning Profile reflecting the 2020–2021 changes are provided in Figure 7.1 and Figure 7.2.

Figure 7.1: Example Page of the Performance Profile for 2020–2021.

Figure 7.2: Example Page of the Learning Profile for 2020–2021.

7.6 Quality Control Procedures for Data Files and Score Reports

Changes to the quality control procedures were made only to the extent of accommodating the revised score reports for spring 2021 (i.e., checking to be sure changes were correctly and consistently applied). For a complete description of quality control procedures, see Chapter 7 in the 2014–2015 Technical Manual—Year-End Model (Dynamic Learning Maps Consortium, 2016) and Chapter 7 in the 2015–2016 Technical Manual Update—Year-End Model (Dynamic Learning Maps Consortium, 2017a).

7.7 Conclusion

Following the spring 2021 administration, five data files were delivered to state partners: the GRF, special circumstance code file, exited students file, teacher survey data file, and test administration observations file. Because no incidents affecting ELA and mathematics were observed during the 2020–2021 administration, an incident file was not needed. Overall, the percentage of students who achieved at the At Target or Advanced levels ranged from approximately 10% to 43% across grades and subjects. However, these results should be interpreted with caution due to the confounding factors of assessment administration changes and COVID-19. Lastly, minor changes were made to score reports to aid in interpretation.