Earlier this year, in late June and early August, we conducted two surveys of random samples of registered voters (the first in California, the second nationally). We invited participants by text message to complete a survey online. Between those two surveys, Joe Biden dropped out of the U.S. presidential election and Kamala Harris replaced him as the presumptive Democratic Party presidential nominee. Additionally, Donald Trump survived an assassination attempt and then selected JD Vance as his running mate.
Given this unprecedented period of upheaval in American politics, we wondered whether we could compare the respondents of the two surveys to better understand patterns of partisan nonresponse bias in this fluid environment. To be clear, the two surveys use different sampling frames, so any comparison is suggestive rather than definitive. We're primarily interested in whether different kinds of people were more likely to respond to our second survey when contacted.
The key takeaways are:
- Survey open rates in August were higher than in June across our partisanship scores, but completion rates dropped steeply between June and August among Republicans.
- Younger voters were more likely to complete our survey in August than June, while older voters were more likely to complete our survey in June than August.
What happened to open and response rates?
The two plots below show the survey open and completion rates for the two surveys. The bars represent the outcome average for each decile of our partisanship score in each survey. Red represents the June survey, while blue represents the August survey. In the first plot, we can see that survey open rates in August were higher than or about the same as in June across our partisanship score.
However, Republicans were much less likely to actually complete the August survey. Starting from the low end of the partisanship score, voters were much likelier to complete the survey in June than in August across the board. Since the two surveys use different sampling frames, we can't draw firm conclusions from this – but it echoes a trend seen in national surveys in which Republicans' relative enthusiasm about the election was higher in June, before the nomination of Kamala Harris.
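The per-decile averages behind these plots can be sketched in a few lines. This is a minimal illustration, not our actual pipeline: the record fields (`partisanship`, `opened`) and the toy data are made up, and we assume the partisanship score lies in [0, 1].

```python
# Hypothetical sketch of the decile-averaged outcome rates described above.
# Field names and records are illustrative, not the actual survey data.

def decile(score):
    """Map a partisanship score in [0, 1] to a decile index 0-9."""
    return min(int(score * 10), 9)

def rate_by_decile(records, outcome_key):
    """Average a binary outcome (e.g. opened/completed) within each decile."""
    totals = [0] * 10
    counts = [0] * 10
    for r in records:
        d = decile(r["partisanship"])
        counts[d] += 1
        totals[d] += 1 if r[outcome_key] else 0
    # None marks deciles with no attempted contacts in this toy data
    return [totals[d] / counts[d] if counts[d] else None for d in range(10)]

# Toy usage with made-up records:
june = [
    {"partisanship": 0.05, "opened": True},
    {"partisanship": 0.08, "opened": False},
    {"partisanship": 0.95, "opened": True},
]
print(rate_by_decile(june, "opened"))  # one average per decile
```

Computing the same averages for both waves, keyed by the same score deciles, is what makes the red and blue bars directly comparable.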
We then dug into age, given that much public polling has shown large shifts in support among younger voters. We found a similar story there. Voters in the 18 to 25 and 26 to 35 age bins were much likelier to respond in August than in June, whereas other voters were either as likely or less likely to respond to our August survey.
How did that affect our sample?
The next question is how these biases affected our survey's final sample composition. Surprisingly, both very Democratic and very Republican voters were better represented in the August sample. Among less partisan voters, however, somewhat Republican voters were more likely to be represented in the August sample, while somewhat Democratic voters were somewhat less likely to be.
However, we saw large shifts when it came to age. Voters between 18 and 45 were more likely to be represented in the August sample, and voters 66 and older were significantly less likely to be in the August sample.
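The composition comparison above boils down to differencing each group's share of the completed sample between the two waves. A minimal sketch, with entirely made-up group counts (the age groupings follow the post; the numbers do not):

```python
# Hypothetical sketch: comparing group shares between two completed samples.
# Counts are illustrative only.

def shares(counts):
    """Convert raw group counts into each group's share of the sample."""
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

june_counts = {"18-45": 400, "46-65": 350, "66+": 250}
august_counts = {"18-45": 500, "46-65": 330, "66+": 170}

june_share = shares(june_counts)
august_share = shares(august_counts)

# Positive shift = group makes up a larger share of the August sample
shift = {g: august_share[g] - june_share[g] for g in june_counts}
print(shift)
```

A positive shift for the 18-45 group and a negative one for 66+ would correspond to the pattern described above.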