Pandemic Impact on 2020 American Community Survey 1-Year Data


In 2020, the COVID-19 pandemic disrupted the lives of people across the country. It also significantly disrupted data collection for the Census Bureau’s American Community Survey (ACS) — one of the nation’s most comprehensive sources of population and housing information about the U.S.  

These disruptions hindered our ability to collect quality data from both households and group quarters. (We described the disruptions and how we adapted in the recent blogs Adapting the American Community Survey Amid COVID-19 and Collecting American Community Survey Data From Group Quarters Amid the Pandemic.)  

In July, we announced that we would not release the standard 2020 1-year ACS data products and instead would release a set of 1-year estimates using experimental weights. Today, we released an analytical report, An Assessment of the COVID-19 Pandemic’s Impact on the 2020 ACS 1-year Data, detailing that decision.  

The report documents how the challenges in collecting responses significantly impaired the quality of the resulting estimates, which were often inconsistent with benchmarks and administrative data, or changed in unexpected magnitudes. These inconsistencies signaled a serious quality issue in the 2020 ACS 1-year data.  

In this blog, we highlight the report. 

Modifications to the ACS Weighting Procedures

One of our immediate quality concerns was that people who responded to the survey had significantly different social, economic, and housing characteristics than those who didn’t respond. This can lead to “nonresponse bias.” 

Statisticians can often adjust for nonresponse bias by giving more weight to responses from underrepresented groups, which is what we attempted to do. With the disruptions to data collection operations, we began assessing possible modifications to our weighting procedures. For example, we built on decennial census work on predictive modeling of vacant housing units to change how the ACS weights vacant units.

Through changes to our standard weighting process, we had hoped we could rectify some of the nonresponse bias introduced in the 2020 ACS data due to the pandemic. However, we found that these changes could not address the bias in a way that meets Census Bureau quality standards. 
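The general idea of reweighting for nonresponse can be sketched as a simple weighting-class adjustment, in which each class's respondents are inflated to also represent that class's nonrespondents. This is a minimal illustration of the technique, not the ACS production methodology; the record layout, cell labels, and weights below are hypothetical.

```python
from collections import defaultdict

def weighting_class_adjustment(units):
    """Inflate each respondent's weight so that, within a weighting
    cell, respondents also represent that cell's nonrespondents."""
    total = defaultdict(float)      # sum of base weights, all sampled units
    responded = defaultdict(float)  # sum of base weights, respondents only
    for u in units:
        total[u["cell"]] += u["weight"]
        if u["responded"]:
            responded[u["cell"]] += u["weight"]
    adjusted = []
    for u in units:
        if u["responded"]:
            factor = total[u["cell"]] / responded[u["cell"]]
            adjusted.append({**u, "weight": u["weight"] * factor})
    return adjusted

# Hypothetical sample: cell "A" has a 50% response rate, cell "B" 100%.
sample = [
    {"cell": "A", "weight": 10.0, "responded": True},
    {"cell": "A", "weight": 10.0, "responded": False},
    {"cell": "B", "weight": 10.0, "responded": True},
    {"cell": "B", "weight": 10.0, "responded": True},
]
adjusted = weighting_class_adjustment(sample)
# The cell "A" respondent's weight is doubled to 20.0, so the adjusted
# weights still sum to the full sampled weight of 40.0.
```

An adjustment of this kind works only insofar as respondents and nonrespondents within a cell resemble each other. When they differ systematically, as the report found for 2020, reweighting alone cannot fully remove the bias.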

Data Quality Measures

Each year, the ACS produces four broad categories of data quality measures, which describe the overall data quality as well as the quality for specific subgroups or categories. Given the challenges to data collection in 2020, we took a close look at each of the categories: 

  • Sample size and interview counts. 
  • Coverage rates. 
  • Response rates. 
  • Item allocation rates. 

The sample size and interview counts provide an overall measure of the size of our housing unit or group quarters samples in all geographies.  

Coverage rates measure how well the interviewed sample covers both housing units and the population, both as a whole and for demographic groups. 

Response rates are calculated as the weighted ratio of units interviewed to the estimate of units that should have been interviewed. 

Item allocation rates measure the completeness of the data collected from respondents. 
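Two of these measures have definitions simple enough to illustrate in code: the weighted response rate (weighted interviews divided by the weighted estimate of units that should have been interviewed) and the item allocation rate (the share of responses whose value for an item was allocated rather than reported). The sketch below uses hypothetical record layouts and values; the actual ACS calculations are more involved.

```python
def weighted_response_rate(units):
    """Weighted ratio of units interviewed to the estimate of units
    that should have been interviewed."""
    interviewed = sum(u["weight"] for u in units if u["interviewed"])
    should_have = sum(u["weight"] for u in units if u["eligible"])
    return interviewed / should_have

def item_allocation_rate(responses, item):
    """Share of responses whose value for `item` was allocated
    (imputed) rather than reported by the respondent."""
    allocated = sum(1 for r in responses if r["allocated"].get(item, False))
    return allocated / len(responses)

# Hypothetical units: one interview, one eligible noninterview, and one
# unit determined to be out of scope for interviewing.
units = [
    {"weight": 50.0, "eligible": True, "interviewed": True},
    {"weight": 50.0, "eligible": True, "interviewed": False},
    {"weight": 25.0, "eligible": False, "interviewed": False},
]
rate = weighted_response_rate(units)  # 50.0 / 100.0 = 0.5

# Hypothetical responses: "income" was allocated for one of three.
responses = [
    {"allocated": {"income": True}},
    {"allocated": {"income": False}},
    {"allocated": {}},
]
income_allocation = item_allocation_rate(responses, "income")  # 1/3
```

Higher allocation rates indicate less complete reporting, which is why both measures feature in the report's assessment of the 2020 data.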

The report released today provides results for each of these four metrics. 

Analysis of Selected Characteristics

The report also details widespread issues observed during our data review across social, economic, and housing characteristics. We observed substantial changes among characteristics that do not traditionally change much year-to-year and among characteristics that diverged from other data sources. Those characteristics include: 

  • Building structure type.  
  • Marital status.  
  • Educational attainment.  
  • Medicaid coverage. 
  • Noncitizen population.  
  • Median household income. 

Changes might be expected among some of these estimates given the magnitude of economic and demographic shocks the U.S. experienced when the COVID-19 pandemic hit. However, the complexities of the ACS survey design, coupled with the disruptions to data collection, made it difficult to assess the reasonableness of these changes using our regular data review tools or processes.

For some topics, we have timely and reliable external benchmarks that can inform comparisons with the ACS 1-year estimates. For example, the Centers for Medicare & Medicaid Services (CMS) releases monthly enrollment data, which provide a specific reference point for the ACS estimates. Historically, the levels of Medicaid enrollment in the CMS and ACS data have differed, but the year-to-year trends have been consistent. This year, however, we noticed a divergence between the ACS 1-year estimates and the CMS data, which is compelling evidence of an issue with the ACS.

Additional characteristics such as marital status, educational attainment, building structure type, and the size of the foreign-born population tend not to change dramatically from one year to the next. Large year-to-year changes in any of these estimates would call into question the reasonableness of the data, and particularly the ability of the weighting procedures to correct for nonresponse bias. Yet this is exactly the pattern we observed in the 2020 estimates.

The selected topics are not the only ones about which analysts raised concerns. Because of the size and characteristics of the noninterviewed population, the interviewed sample was more economically advantaged than in earlier years. At face value, the 2020 ACS 1-year data made it appear that the U.S. population had higher levels of education, more married couples, fewer never-married individuals, higher median household incomes, fewer noncitizens, and was more likely to live in single-family housing units. Amid a pandemic that negatively affected so many in 2020, these data strongly imply that the respondents were not nationally representative. We noted these patterns at the national level; at lower levels of geography, we anticipate additional and more extreme instances of unlikely results.

Summary

The pandemic and the resulting change in survey operations affected the quality of the 2020 ACS 1-year estimates. Given the necessary constraints on data collection, it became evident that our standard weighting methodology could not adequately account for the varying data collection strategies employed throughout the year.

It was clear that each month’s interviews differed not only in number, but also in the population we were able to reach and have respond to the survey. Therefore, the ACS began to look less like a continual monthly survey stemming from a common design and more like 12 independent monthly surveys, each with its own data collection strategy. 

Despite our best efforts to mitigate the collection disruptions and modify the weighting adjustments, the outcome could not be fully evaluated until data collection ended. Unfortunately, even with modifications focusing on known sources of bias, the Census Bureau determined that the estimates did not meet our statistical quality standards. These inconsistencies led to the Census Bureau’s decision not to release the standard set of 1-year data products.  

In November, the Census Bureau will release a set of experimental estimates for the ACS 1-year data as well as a technical working paper describing the research. These estimates use a new weighting method to attempt to mitigate the weighting and other issues discussed in the analytical report. The ACS 1-year experimental data may not meet all our quality standards. Because of this, we clearly identify experimental data products and include methodology and supporting research with their release.
