Study: Are Election 2020 Poll Respondents Honest About Their Vote?

By Leib Litman, PhD, Zohn Rosen, PhD, Cheskie Rosenzweig, MS, Aaron Moss, PhD, Adam Dietrich, & Jonathan Robinson, PhD

Background

Lately, there’s been considerable debate over the accuracy of presidential polls. While recent polls show Joe Biden ahead, a number of pundits speculate that some Donald Trump supporters may be hesitant to share their true opinions when polled by phone. That hypothesis is gaining traction, leading some to argue that Trump may actually be ahead despite what the latest numbers show. It’s also fueled by the belief that 2020 will be a repeat of the 2016 election, when Trump polled poorly in advance of the election but still went on to win the Electoral College vote.

Despite the current debate over whether there are segments of Trump (or Biden) backers reticent to express their true opinions in phone polls, there’s been little empirical investigation into whether the phenomenon actually exists. Pundits on major broadcast and cable news networks, such as Fox News and CNN, continue to speculate about the potential impact of so-called “shy Trump voters” on the outcome of the November presidential election. In a recent article published in The New York Times, David Winston said the following:

“The idea that people lie, it’s an interesting theory, and it’s not like it’s completely off-the-wall. But it’s obviously a very complicated thing to try to prove because what do you do? Ask them, ‘Are you lying?’”

On its face, a poll that asks people whether they lie in phone polls may not make much sense. After all, why would a person who lies in a telephone poll tell the truth about that in a different online poll? We here at CloudResearch reasoned that the issue may be methodological.

To explore this issue, we structured our survey to overcome shortcomings of previous polls. Instead of simply asking voters whom they will vote for and then asking whether they just lied, we centered our research around a general question: “Are you comfortable in truthfully disclosing the presidential candidate you intend to vote for in a telephone poll?” Our rationale for this approach was that there’s a major difference between admitting you just lied and admitting to being genuinely concerned about disclosing your preferred candidate.

For the most part, we expected to find very few “shy voters.” After all, telephone surveys are supposed to be anonymous, so why would people be reluctant to share their opinions? However, to the extent that people said they were reluctant to express their voting preferences in a telephone poll, we wanted to understand why. As a result, we included open-ended follow-up questions to better understand the factors that drive voters to fudge their responses. Our survey was conducted with 1,000 participants on CloudResearch’s Prime Panels platform, which draws from a combination of online research panels.


What CloudResearch Found

  • 11.7% of Republicans say they would not report their true opinions about their preferred presidential candidate on telephone polls. 
  • In contrast, just 5.4% of Democrats say they’d be reluctant to share their true voting intentions, roughly half the rate among Republicans. 
  • 10.5% of Independents fell into the “shy voter” category, just over a percentage point lower than the rate among Republicans.  

After asking people whether they would express their true opinions in telephone polls, we then asked about their preferred candidate. This ordering was important because we did not want to fall into the same trap as other pollsters, who tend to lead with questions about candidate preference. When we broke the responses down by current Trump vs. Biden supporters, we found the following:

  • 10.1% of Trump supporters said they were likely to be untruthful on phone surveys, roughly double the share of Biden supporters (5.1%) reticent to share their true intentions.

When respondents indicated that they would be untruthful during polls, we followed up to confirm those responses and then asked why “shy voters” are concerned about sharing their voting intentions. Some example responses are below. They are a representative sampling of the viewpoints we collected, sentiments that are not easy to gauge from mere yes/no questions.

“I don’t believe the information would be confidential and I think it’s dangerous to express an opinion outside of the current liberal viewpoint.”

“Well I probably wouldn’t give my opinion period, but if pushed, I would not give my real opinion for fear of reprisal if someone found out.”

“Because most polls released to the public are slanted and aren’t scientifically based. So, they are messing with the results of the survey from the beginning by knocking down one party or the other. I’m just trying to right the ship.”

“I am hounded day and evening by phone solicitors.  They interrupt me all the time; sometimes my irritation takes over, and I don’t answer correctly.”

“My answers could be recorded so I don’t really trust such phone conversations.”

“I do not discuss politics — let alone with a total stranger on the telephone.”

“I don’t always trust phone call surveys. I wouldn’t want to be bombarded with phone calls and political mail.”

“I don’t want my opinion associated with my phone number.”

“I am less anonymous, and somewhat ashamed of my opinion as it is frowned upon.”


In General, “Shy Voters” Cited Six Concerns: 

1. A lack of trust in phone polls as truly being anonymous. 

2. An apprehension about having their phone numbers associated with recorded responses.

3. Fear that their responses will become public in some manner.

4. Fear of reprisal and related detrimental impact to their financial, social, and family lives should their political opinions become publicly known. 

5. A general dislike of phone polls. 

6. An intent to deliberately mislead pollsters, driven by a general distrust of the media and political pundits (a sentiment expressed by only a few “shy voters”).

These concerns were expressed more often by Republicans and Independents, and by those who said they would vote for Donald Trump, than by Democrats and Biden supporters, who were less likely to worry that their voting intentions would be shared with others.

However, our survey findings, which show a greater reluctance on the part of Republicans, Independents, and Trump voters to disclose their opinions, should not be interpreted as evidence of outright lying. Rather, such reluctance seems to stem from a lack of trust in the anonymity of polls and a fear of the consequences if their opinions were to become public.

The results could have implications for the accuracy of phone polls: if Republicans, Independents, and supporters of Donald Trump (regardless of party affiliation) are less likely to participate in polls or to accurately disclose the candidate they support, the resulting polls will be systematically biased.

Given razor-thin margins in the swing states, such bias may have important consequences, although more research is required to fully understand the potential magnitude of this effect. We intend to examine the magnitude of “shy-voter bias” in specific states such as Arizona, Florida, Michigan, North Carolina, Pennsylvania, and Wisconsin over the coming weeks.

The results of our survey also highlight the primary concerns that pollsters may need to address to reassure voters about the anonymity and privacy of phone polling. We’ll examine these issues and potential remedies in future research.


Methodology

Sample 

Interviews were collected from two national online sample waves of 1,000 registered voters each (2,000 respondents in total), matched to U.S. Census data on gender, age, race, and region. Data were collected August 19 – 27, 2020 using CloudResearch’s Prime Panels, a platform that incorporates data quality checks in sample recruitment and sources participants from an aggregate of online research panels (Chandler et al., 2019). The two waves differed in how quotas were set. In the first wave, the full sample was matched to the U.S. Census and split equally among Republicans, Democrats, and Independents. In the second wave, rather than matching the entire sample to the U.S. Census, each political party was matched to the demographic quotas appropriate for that party. The results presented above are for the first wave of participants, although the overall findings were similar across both waves. Specifically, in Wave 2, 10.5% of Trump supporters said they were likely to be untruthful on phone surveys compared to 4.6% of Biden supporters.
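
For readers curious how census-matched quotas translate into recruitment targets, the short Python sketch below converts marginal proportions into per-category quota counts. The proportions and category labels are illustrative placeholders, not the actual quotas used in this study.

```python
# Minimal sketch: turning assumed census-style proportions into quota targets.
# The proportions below are placeholders, not the quotas used in this study.

N = 1000  # respondents per wave

gender_props = {"Female": 0.51, "Male": 0.49}
region_props = {"Northeast": 0.17, "Midwest": 0.21, "South": 0.38, "West": 0.24}

def quota_targets(proportions, n):
    """Convert marginal proportions into integer quota targets that sum to n."""
    targets = {category: round(share * n) for category, share in proportions.items()}
    # After rounding, nudge the largest cell so the targets sum exactly to n.
    targets[max(targets, key=targets.get)] += n - sum(targets.values())
    return targets

print(quota_targets(gender_props, N))  # {'Female': 510, 'Male': 490}
print(quota_targets(region_props, N))  # {'Northeast': 170, 'Midwest': 210, 'South': 380, 'West': 240}
```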

Survey Design 

Prior to entering the survey, all respondents were screened for attentiveness and English language comprehension. Following the screening process, a response verification protocol was used, consisting of multiple interactive interview steps.

Step 1 – Respondents were first asked one of two randomly assigned questions. Q1: “Given today’s charged political environment, if you were asked in an automated telephone poll (also called “Interactive Voice Response” or “IVR”), would you give your true opinion about which presidential candidate you are likely to vote for?” Q2: “Given today’s charged political environment, if you were asked by a telephone interviewer, would you give your true opinion about which presidential candidate you are likely to vote for?” We also wanted to compare how comfortable respondents are responding to telephone polls versus online and in-person polls, so we asked the same question about phone interviews, online surveys, and in-person polls, presented side by side in the same grid. The response options for all questions were: 1. “I would definitely express my true opinion”, 2. “I would probably express my true opinion”, 3. “I would probably not express my true opinion”, and 4. “I would definitely not express my true opinion”.

Step 2 – For those who indicated that they would not provide their true opinion on either question from Step 1 (i.e., chose response option 3 or 4), a follow-up question was asked: “On a previous question, you indicated that you may not express your true opinion over the phone about which presidential candidate you support. Sometimes people click on a button by accident, so we just wanted to verify your response. Are you hesitant to disclose your support for a presidential candidate on a phone survey?” The response options to the verification question were: 1) “I most likely would not express my true opinion about which presidential candidate I support on a phone survey.” 2) “I most likely would express my true opinion about which presidential candidate I support on a phone survey.” Only people who verified their response in Step 2 were counted as being reluctant to share their opinion about which presidential candidate they would support on a phone survey.

Step 3 – Those who verified their response in Step 2 were prompted to provide more context in an open-ended question: “Can you explain why not? Specifically, why would you most likely not express your true opinion in a phone survey?”
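
To make the branching logic of the three steps explicit, here is a simplified Python sketch. The question wording is abbreviated, the Step 1 grid questions are omitted, and the `ask` callable is a hypothetical stand-in for the survey platform, so this should be read as an illustration rather than the production instrument.

```python
import random

# Simplified sketch of the three-step verification protocol described above.
# Question text is abbreviated and the Step 1 grid questions are omitted.

STEP1_QUESTIONS = {
    "IVR": "If you were asked in an automated telephone poll (IVR), would you "
           "give your true opinion about which candidate you are likely to vote for?",
    "INTERVIEWER": "If you were asked by a telephone interviewer, would you give "
                   "your true opinion about which candidate you are likely to vote for?",
}

STEP1_OPTIONS = [
    "I would definitely express my true opinion",
    "I would probably express my true opinion",
    "I would probably not express my true opinion",
    "I would definitely not express my true opinion",
]

def run_protocol(ask):
    """`ask(question, options)` is supplied by the survey platform and returns the chosen option."""
    record = {}

    # Step 1: randomly assign one of the two phone-poll framings.
    condition = random.choice(list(STEP1_QUESTIONS))
    record["condition"] = condition
    record["step1"] = ask(STEP1_QUESTIONS[condition], STEP1_OPTIONS)

    # Step 2: only respondents choosing option 3 or 4 get the verification question.
    if record["step1"] in STEP1_OPTIONS[2:]:
        record["step2"] = ask(
            "Sometimes people click a button by accident. Are you hesitant to disclose "
            "your support for a presidential candidate on a phone survey?",
            ["I most likely would not express my true opinion on a phone survey",
             "I most likely would express my true opinion on a phone survey"],
        )

        # Step 3: only verified "shy voters" receive the open-ended follow-up.
        if record["step2"].startswith("I most likely would not"):
            record["step3"] = ask("Can you explain why not?", None)

    return record
```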

Statistical Analysis

We evaluated the association between support for a presidential candidate and reluctance to share opinions on phone surveys (phone sharing reluctance) using logistic regression. We began with a simple bivariate association between the candidate respondents supported and phone sharing reluctance, and found that Trump supporters were significantly more reluctant to share their opinions on phone surveys than Biden supporters (Trump: 10.1% vs. Biden: 5.4%, p < .001).

The association between candidate support and phone sharing reluctance may, however, be confounded by demographic factors such as age and education. To address these potential confounds, we modeled the association with multiple demographic covariates included in the logistic regression model: gender, age, race, ethnicity (following the methodology adopted by the U.S. Census, Latino/a ethnicity was measured separately from race and was therefore entered as a separate covariate), marital status, family composition, education, and region. Heteroskedasticity was accounted for by using robust standard errors.

Even after the inclusion of the covariates in the model, Trump supporters were significantly more reluctant to share their opinions on phone surveys compared to Biden supporters (p < .001).
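
As a rough illustration of the modeling approach described here, the Python sketch below fits first the bivariate and then the covariate-adjusted logistic regression with robust standard errors using statsmodels. The data file and column names are hypothetical stand-ins; the study’s actual variable coding may differ.

```python
# Illustrative sketch of the logistic regression analysis using statsmodels.
# The file name and column names are hypothetical, not the study's actual data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wave1.csv")  # one row per respondent (hypothetical file)

# Outcome: phone_reluctant = 1 if the respondent verified in Step 2 that they
# would not share their true opinion on a phone survey, 0 otherwise.

# Bivariate model: candidate support only (Biden as the reference category).
bivariate = smf.logit("phone_reluctant ~ C(candidate, Treatment('Biden'))", data=df)
print(bivariate.fit(cov_type="HC1").summary())

# Covariate-adjusted model with heteroskedasticity-robust (HC1) standard errors.
formula = (
    "phone_reluctant ~ C(candidate, Treatment('Biden')) + C(gender) + age + "
    "C(race) + C(hispanic) + C(marital_status) + C(family_composition) + "
    "C(education) + C(region)"
)
adjusted = smf.logit(formula, data=df)
print(adjusted.fit(cov_type="HC1").summary())
```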

Limitations

As with all studies, there are several limitations to the conclusions one can draw from this one. First, we used a nonprobability sampling frame, which means the results may not generalize to the whole population. This remains true even though we replicated the main findings across two samples: one recruited with quotas set to match the demographics of the U.S. population (Wave 1) and one with quotas set to match the demographics of voters within each political party (Wave 2). Additionally, it is important to remember that these are self-report data, which may not perfectly align with people’s real-world behavior.
