Statewide Summative Assessment Results, 2020-21


Special Considerations and Analyses Due to the COVID-19 Pandemic

When viewing and interpreting the results for summative assessments administered in 2020–21, especially when making comparisons with 2018–19 and earlier administrations, it is important to note the similarities to and differences from previous years. Notably, this year's exams used the same test blueprint, the same item bank or the same or similar test forms, and the same in-person testing protocols as in 2018–19. The list of differences, however, is considerably longer. First, schools were fully remote from mid-March to mid-June 2020, a marked departure from prior years. In 2020–21, school learning models changed throughout the school year, and students learned remotely to varying degrees due to factors beyond educator control. In addition, some students tested remotely, a new option for 2020–21. In-person schooling itself changed, with new instructional approaches emerging (e.g., concurrent teaching, remote academies). Finally, students and educators reported general stress, anxiety, and trauma.
Attendance data from 2020–21 confirm that how students learned (i.e., in-person, hybrid, or remote) varied across districts and schools. Which students, and how many, learned remotely also varied, as did which students tested remotely (a new option for the state assessment) and which participated in in-person testing. For these reasons, district-level data in 2020–21 are not directly comparable to one another as they were in prior years. In anticipation of these comparability concerns, the U.S. Department of Education approved Connecticut's request to waive district/school accountability for the 2020–21 year. To learn about the impact of the pandemic on student achievement and growth, the CSDE conducted specialized analyses at the state level.
  • The CSDE used “matched cohort growth” (i.e., growth of same students from one grade to another) when feasible to evaluate how growth during the pandemic was different from growth before the pandemic.
  • All results are disaggregated by a student’s learning model: fully/mostly in-person (more than 75% of days in-person); hybrid (between 25% and 75% of days in-person); or fully/mostly remote (less than 25% of days in-person).
  • Only those scores from students who tested in-person were included.
  • Lastly, given the variations in learning models and test participation across student groups, comparisons are made within student groups (e.g., Students with High Needs*).
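The learning-model cut points above can be sketched as a small helper. This is a minimal illustration only; the function name is hypothetical, and the assignment of the exact 25% and 75% boundary values is an assumption, since the report does not specify how boundary cases are handled.

```python
def classify_learning_model(pct_in_person: float) -> str:
    """Bucket a student's share of in-person school days into the three
    learning-model groups described in the report.

    Boundary handling (exactly 25% or 75% -> hybrid) is an assumption.
    """
    if pct_in_person > 75:
        return "fully/mostly in-person"
    if pct_in_person >= 25:
        return "hybrid"
    return "fully/mostly remote"
```

For example, a student attending 90% of days in person would fall in the fully/mostly in-person group, while a student attending 10% of days in person would fall in the fully/mostly remote group.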


The CSDE’s specialized analyses of data from assessments administered in-person reveal the following:

  • In all grades and across most student groups, those who learned in-person during the 2020-21 school year lost the least ground academically.
  • Those who learned in hybrid or remote models showed substantially weaker achievement and growth during the pandemic.
  • While academic impacts are seen in all subjects, the observed differences are largest in mathematics.

The full report provides additional information and analyses including longitudinal growth trajectories, growth model estimations, and remote test administrations.

Figure 1 presents the proficiency rates (i.e., the percentage of students at level 3 or 4) of students in Grades 5-8 in 2020–21 on the Smarter Balanced assessments compared to their achievement two grades earlier, in 2018–19. The results reveal that achievement in 2020–21 for these matched students was substantially lower than in 2018–19, especially in Math, and especially for those who learned in hybrid or remote models. This pattern holds for students with and without high needs. Similar patterns of greater declines among remote learners appear when the data are disaggregated by race/ethnicity: Black/African American and Hispanic/Latino students experienced the largest declines in proficiency rates from 2018–19 to 2020–21. For all race/ethnicity groups, declines were larger in Math than in English language arts (ELA).
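A matched-cohort comparison of the kind underlying Figure 1 can be sketched as follows. The records, column names, and achievement levels here are hypothetical illustrations; the CSDE's actual data structures are not described in this summary.

```python
import pandas as pd

# Hypothetical records: one row per student per administration.
prior = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "level_2019": [3, 2, 4, 1],   # achievement level in 2018-19
})
current = pd.DataFrame({
    "student_id": [1, 2, 3, 5],
    "level_2021": [3, 1, 3, 2],   # achievement level in 2020-21
    "model": ["in-person", "remote", "hybrid", "remote"],
})

# Keep only students who tested in both years (the "matched cohort").
matched = prior.merge(current, on="student_id", how="inner")

# Proficiency = level 3 or 4; compare rates before and during the pandemic.
rate_2019 = (matched["level_2019"] >= 3).mean()
rate_2021 = (matched["level_2021"] >= 3).mean()

# Disaggregate the 2020-21 rate by learning model, as in the report.
by_model = (matched["level_2021"] >= 3).groupby(matched["model"]).mean()
```

The inner merge is what makes the cohort "matched": only students with scores in both administrations contribute, so changes in rates reflect the same students' growth rather than changes in who tested.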

Figure 1: Matched Cohort (2018–19 to 2020–21) Proficiency Rates by High Needs Status (Grades 5-8)
Graph of Matched Cohort Data

Unlike students in Grades 5-8 who tested in ELA and Math, there are no prior scores on the same assessment for students:

  • in Grades 3 and 4;
  • in Grades 5, 8, or 11 who took the Next Generation Science Standards (NGSS) Assessment; and
  • in Grade 11 who took the Connecticut SAT School Day.

Figure 2 compares the ELA and Math performance of Grade 3 students in 2020–21, by learning model, to the results of all Grade 3 students in 2018–19. Similar to the Grades 5-8 matched cohort results, Grade 3 students who learned in-person achieved at levels closest to those of the students who tested in 2018–19, while those who learned in a hybrid or remote model showed lower achievement. Again, the differences are larger in Math than in ELA, and this pattern holds among students with and without high needs. Similar patterns are seen in Grade 4.


Figure 2: ELA and Math Achievement in Grade 3
Caution: Unmatched student data that do not account for any prior differences between the groups

Graph of Grade 3 Smarter Balanced Data


The results from the NGSS (Figure 3) and the Connecticut SAT School Day (Figure 4) assessments also reveal lower achievement among those who learned in hybrid or remote models as compared to those who learned in-person.


Figure 3: Science Achievement on the NGSS (Grades 5, 8, 11)
Caution: Unmatched student data and low participation among fully/mostly remote learners and students with high needs in Grade 11

Graph of NGSS Data

Figure 4: CT SAT School Day (Grade 11)

Caution: Unmatched student data and low participation among fully/mostly remote learners and students with high needs  

Graph of SAT Data

Achievement Data Files (field descriptions for all files)

*A student is included in the high needs group if they are an English learner, a student with a disability, and/or a student from a low-income family.