By Leib Litman, PhD, Aaron Moss, PhD, Cheskie Rosenzweig, MS, Jonathan Robinson, PhD, Zohn Rosen, PhD, Adam Dietrich & Shalom Jaffe
National polls show Joe Biden with a clear and consistent lead over President Donald Trump, meaning Biden is likely to win the popular vote. Winning the Presidency, however, depends on capturing the electoral college votes within critical swing states where polls show Biden with a smaller but important lead. Despite these numbers, not everyone is comfortable concluding that Biden is on his way to a sure win. One reason for this reluctance is the idea that polls may be inaccurate and missing some “shy voters”.
Since 2016, a lot has been written and said about the possibility of “shy Trump voters.” In addition to some large polling misses within specific states in 2016, a few studies have given life to the idea that people who support Trump may be avoiding the polls or, in some cases, refusing to share their honest opinion with pollsters. Many polling experts, along with the American Association for Public Opinion Research, have been skeptical of the idea that “shy Trump voters” exist, and because it’s hard to gather definitive evidence one way or the other, the argument has continued.
In August, we conducted a national poll with 1,000 registered voters using an online participant recruitment platform. We asked Democrats, Republicans, and Independents whether they were comfortable disclosing their choice for President in a telephone poll. We found that Republicans and Independents reported more reluctance to share their opinion than Democrats. Specifically, 11.7% of Republicans, 10.5% of Independents, and 5.4% of Democrats told us they were uncomfortable expressing their opinion in phone polls.
Because the 2020 Presidential election hinges on voter turnout within swing states, we sought to replicate our “shy voter” survey within a critical state of the 2020 election: Pennsylvania. From Sept. 23rd to Oct. 6th, we conducted an online survey with Pennsylvania voters and asked about their comfort disclosing their preferred candidate for President in a phone poll. Similar to our national poll, we found that 15.9% of Republicans, 19.4% of Independents, and 10.0% of Democrats said they were not comfortable disclosing their opinion in a phone poll. Beyond this question, we also asked people how likely they would be to pick up the phone if they knew a pollster was calling. Independents (51.8%) and Republicans (46.3%) were more likely than Democrats (36.0%) to say they would avoid picking up the phone.
In addition to documenting differences in people’s willingness to participate in phone polls, our survey sought to establish why some voters are more reluctant than others to share their opinions. To this end, we measured people’s fear and concern about the anonymity of the polling process.
Based on the answers we received to open-ended questions in our previous (national) study, we developed a scale to measure respondents’ motivations on two dimensions: General Avoidance and Fear. The General Avoidance dimension asked about reasons why people may not want to answer phone polls in general; for example, one item read, “I don’t like to answer polls, period.” The Fear dimension asked specifically whether respondents were concerned that they could suffer harassment for their opinions, or that the pollster could identify them, with items such as, “People may retaliate against me if they find out my political views.” Both scales are shown below.
Our survey revealed that people who scored higher on the whole scale, and on either of its subscales, were more likely to report reluctance to participate in phone polls. Further, Republicans scored higher on average than Democrats on all of the scales, indicating that they agreed more with the statements (1.58 vs. 1.35 on the full scale, respectively, where response options range from 0 to 4).
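The scoring described above can be sketched as follows. This is a minimal illustration, not the actual survey instrument: the item identifiers and the number of items per subscale are assumptions; only the 0-to-4 response range and the two-subscale structure come from the text.

```python
# Illustrative scoring of a two-dimension reluctance scale.
# Item names and counts are hypothetical; responses run 0-4.

GENERAL_AVOIDANCE_ITEMS = ["avoid_1", "avoid_2"]  # e.g., "I don't like to answer polls, period."
FEAR_ITEMS = ["fear_1", "fear_2"]                 # e.g., "People may retaliate against me..."

def score_scale(responses):
    """Average item responses into subscale and full-scale scores."""
    avoid = [responses[i] for i in GENERAL_AVOIDANCE_ITEMS]
    fear = [responses[i] for i in FEAR_ITEMS]
    all_items = avoid + fear
    return {
        "general_avoidance": sum(avoid) / len(avoid),
        "fear": sum(fear) / len(fear),
        "full_scale": sum(all_items) / len(all_items),
    }

example = {"avoid_1": 3, "avoid_2": 2, "fear_1": 1, "fear_2": 2}
print(score_scale(example))
# → {'general_avoidance': 2.5, 'fear': 1.5, 'full_scale': 2.0}
```

Averaging (rather than summing) keeps every score on the original 0-to-4 response metric, which is how the group means above are reported.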
Determining how the results of our surveys apply to the 2020 election is difficult. Just because people express concern about sharing their opinions with pollsters doesn’t mean they would actually decline to participate in a phone poll if a pollster contacted them. What our data do show is that people who avoid the polls mostly fall into two categories: they don’t like answering polls in general, or they are nervous that the poll isn’t actually anonymous. We have also shown that Republicans are more likely than Democrats to have these concerns, both nationally and in Pennsylvania. But it remains to be seen whether, and to what extent, this may be influencing the polls.
Interviews were collected from an online sample of 1,000 likely voters in Pennsylvania, matched to U.S. Census data on gender, age, race, and region. Data were collected September 23 – October 6, 2020 using CloudResearch’s Prime Panels, a platform that incorporates data quality checks in sample recruitment and sources participants from an aggregate of online research panels (Chandler et al., 2019).
Prior to entering the survey, all respondents were screened for attentiveness and English language comprehension. Following the screening process, a responsive verification protocol was used, consisting of multiple interactive interview steps. Step 1 – Respondents were first asked, “If you were asked in an automated telephone poll (also called “Interactive Voice Response” or “IVR”), would you give your true opinion about which presidential candidate you are likely to vote for?” We also wanted to compare how comfortable respondents are responding to telephone polls versus online and in-person polls, so we asked the same question about phone interviews, online surveys, and in-person polls, presented next to each other in the same grid. The response options for all questions were: 1. “I would definitely express my true opinion,” 2. “I would probably express my true opinion,” 3. “I would probably not express my true opinion,” and 4. “I would definitely not express my true opinion.”
Step 2 – For those who indicated in Step 1 that they would not provide their true opinion (i.e., responded with option 3 or 4), a follow-up question was asked: “On a previous question, you indicated that you may not express your true opinion over the phone about which presidential candidate you support. Sometimes people click on a button by accident, so we just wanted to verify your response. Are you hesitant to disclose your support for a presidential candidate on a phone survey?” The response options to the verification question were as follows: 1) “I most likely would not express my true opinion about which presidential candidate I support on a phone survey.” 2) “I most likely would express my true opinion about which presidential candidate I support on a phone survey.” Only people who verified their response in Step 2 were counted as being reluctant to share their opinion about which presidential candidate they would support on a phone survey.
Step 3 – Those who verified their response in Step 2 were prompted to provide more context about their answer in an open-ended response: “Can you explain why not? Specifically, why would you most likely not express your true opinion in a phone survey?”
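The Step 1 → Step 2 classification rule can be sketched as a small function. This is a minimal sketch of the rule as described, with illustrative parameter names; the response codes follow the text (Step 1 options 3/4 mean “probably/definitely not,” and Step 2 must confirm that answer).

```python
# Sketch of the two-step "reluctant respondent" classification rule.
# Codes follow the survey text; names are illustrative.

def is_phone_reluctant(step1_phone, step2_verified=None):
    """A respondent counts as phone-reluctant only if they chose option
    3 or 4 in Step 1 AND confirmed that answer in Step 2."""
    flagged = step1_phone in (3, 4)
    if not flagged:
        return False  # Step 2 is never shown to these respondents
    return bool(step2_verified)

# Chose "probably not" but retracted at verification -> not counted
print(is_phone_reluctant(3, False))  # → False
# Chose "definitely not" and verified -> counted as reluctant
print(is_phone_reluctant(4, True))   # → True
```

The design choice here mirrors the text: verification is conservative, so an accidental click in Step 1 alone never classifies a respondent as reluctant.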
We evaluated the association between support for a presidential candidate and reluctance to share opinions on phone surveys (phone sharing reluctance) using logistic regression. We began with a simple bivariate association between which candidate the respondent supported and phone sharing reluctance, and found that Trump supporters were significantly more reluctant to share their opinions on phone surveys than Biden supporters, p < .001.
The association between which candidate the respondents supported and phone sharing reluctance may, however, be confounded by demographic factors such as age and education. To address these potential confounds, we re-modeled the association with multiple demographic covariates included in the logistic regression model: gender, age, race, ethnicity (following U.S. Census methodology, Latino/a ethnicity was measured separately from race and was thus entered as a separate covariate), marital status, family composition, education, and region. We accounted for heteroskedasticity by using robust standard errors.
Even with the covariates included in the model, Trump supporters remained significantly more reluctant to share their opinions on phone surveys than Biden supporters (p < .001).
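The modeling approach described above can be illustrated with a self-contained sketch. The data below are synthetic and the variable names are hypothetical; only the technique (a logistic regression of reluctance on candidate support plus covariates, with heteroskedasticity-robust sandwich standard errors) follows the text. The actual analysis used the survey data, not this simulation.

```python
import numpy as np

# Synthetic illustration of logistic regression with HC0 robust SEs.
# "trump" and "age" are stand-ins for the real predictors and covariates.
rng = np.random.default_rng(0)
n = 2000
trump = rng.integers(0, 2, n)                 # 1 = hypothetical Trump supporter
age_z = rng.standard_normal(n)                # standardized covariate
X = np.column_stack([np.ones(n), trump, age_z])
logit_p = -1.5 + 0.8 * trump                  # simulated positive effect
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)

def fit_logit_robust(X, y, iters=25):
    """Newton-Raphson ML fit; returns coefficients and HC0 robust SEs."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)
        H = X.T @ (W[:, None] * X)            # observed information
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    p = 1 / (1 + np.exp(-X @ beta))
    bread = np.linalg.inv(X.T @ ((p * (1 - p))[:, None] * X))
    meat = X.T @ (((y - p) ** 2)[:, None] * X)  # outer products of scores
    robust_se = np.sqrt(np.diag(bread @ meat @ bread))
    return beta, robust_se

beta, se = fit_logit_robust(X, y)
print(f"support coefficient: {beta[1]:.2f} (robust SE {se[1]:.2f})")
```

The "sandwich" estimator (bread · meat · bread) gives standard errors that remain valid when the error variance differs across respondents, which is the role robust standard errors played in the model above.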
As with all studies, several limitations of our methods should be kept in mind. The most important is that these are self-report data, which may not perfectly align with people’s real-world behavior. Additionally, it is unclear how the weighting methods used in specific polls may attenuate any “shy voter” effect that exists in the population.