
2016 vs. 2020 RCP, Final Results, and Enthusiasm Polls

 

Thread poll: Do you enjoy this kind of analysis?
Yes: 2 (66.67%)
No: 1 (33.33%)
I don't enjoy anything political: 0 (0%)
Total: 3

Preface: First, let me just get this out of the way. I am a federal libertarian and a state social conservative. This means I have social conservative leanings, but I prefer the vast majority of those to be enacted at a state level. On a federal level, I prefer the government to stay out of most issues and let states do what they want (slavery and abortion being some exceptions). I am trying to look purely at data, without bias, to see how Trump vs. Biden (2020) is looking in the states that matter compared to Trump vs. Clinton (2016). If there is ANY error in the data I have, it is not intentional, nor is it meant to sway readers toward a certain candidate. If you find some data incorrect, please let me know and I will adjust it if you can show me the source. I want this data to be as accurate as possible, so I encourage this! Ideally, if this is popular, I will try to do these for major elections going forward. I started this data collection months ago and was a knucklehead for not linking the sources to each piece of data on my spreadsheet. If someone wants to challenge something, I will attempt to rediscover where I found those polls (enthusiasm polls, specifically, are difficult to find). Most of them are from RCP (Real Clear Politics), which averages polls from all over the nation every day.

What the data is: The data will show a few different things. Everything except for the enthusiasm polls will be based on the battleground states of Pennsylvania, Michigan, Wisconsin, Florida, North Carolina, and Arizona. Ohio, Minnesota, and Nevada are not included (even though recent polls have indicated that they are becoming more of a tossup) as they were not battleground states in 2016. The data starts at 240 days out from the election, then 210 days out, and continues in 30-day steps down to 30 days out.

C = Clinton

B = Biden

T = Trump

RCP 2016 vs. RCP 2020: Clinton vs. Trump and Biden vs. Trump polls

Days out | RCP 2016 | RCP 2020
240 | C+4.6 | B+2.6
210 | C+6.2 | B+2.2
180 | C+6.7 | B+3.6
150 | C+4.3 | B+3.5
120 | C+3.7 | B+5.3
90 | C+4.3 | B+5.0
60 | C+2.8 | B+3.1
30 | C+4.4 | B+3.8
Avg. | C+4.625 | B+3.64

Difference between the two: Clinton up .985%

 

Based on this data, Clinton was polling almost 1% better in these battleground states than Biden is now. But how did the actual election go?
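To make the arithmetic behind the table explicit, here is a minimal sketch (Python; the margins are typed in from the table above and the variable names are my own) of how the two averages and the .985% gap fall out. It treats all eight snapshots equally, exactly as the table's average does.

```python
# Battleground-state RCP margins at 240, 210, ..., 30 days out, from the table above.
clinton_2016 = [4.6, 6.2, 6.7, 4.3, 3.7, 4.3, 2.8, 4.4]  # Clinton lead over Trump
biden_2020 = [2.6, 2.2, 3.6, 3.5, 5.3, 5.0, 3.1, 3.8]    # Biden lead over Trump

avg_clinton = sum(clinton_2016) / len(clinton_2016)  # 4.625
avg_biden = sum(biden_2020) / len(biden_2020)        # 3.6375, shown as 3.64 above

print(f"Clinton average lead: {avg_clinton:.4f}")
print(f"Biden average lead:   {avg_biden:.4f}")
# The .985% figure comes from rounding Biden's average to 3.64 first: 4.625 - 3.64 = 0.985.
print(f"Gap: {avg_clinton - avg_biden:.4f}")
```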

Actual Results RCP 2016 vs. RCP 2020

Actual Results in millions:

State | Donald | Hillary
Arizona | 1.252 | 1.161
Florida | 4.618 | 4.504
North Carolina | 2.363 | 2.189
Wisconsin | 1.405 | 1.383
Michigan | 2.28 | 2.269
Pennsylvania | 2.971 | 2.926
Average | 14.889 | 14.432

Total votes cast for both candidates in these states: 29.321
Winner margin: Donald +1.56%

Difference compared to battleground polls: Trump performed 6.185% better than polls indicated he would in 2016 for these states.

 

If the polls miss by a similar amount this year as they did in 2016, then Trump could perform around 7.17% better in these states than the polls have been indicating (the 6.185% swing from 2016, plus the .985% by which Biden is polling worse than Clinton did; Clinton polled +4.625% on average before the election and lost these states by 1.56%). Currently, Biden leads the polls in those states by 3.64%, so if the gap between polls and results were the same as in 2016, he would lose those states by 2.545%, or .985% more than Clinton lost them in 2016.
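For transparency, here is a minimal sketch (Python; vote totals typed in from the table above, variable names my own) of the winner-margin and swing arithmetic.

```python
# 2016 two-party vote totals (millions) in AZ, FL, NC, WI, MI, PA, from the table above.
trump_2016 = [1.252, 4.618, 2.363, 1.405, 2.280, 2.971]
clinton_2016 = [1.161, 4.504, 2.189, 1.383, 2.269, 2.926]

trump_total = sum(trump_2016)      # 14.889
clinton_total = sum(clinton_2016)  # 14.432
two_party = trump_total + clinton_total                          # 29.321
actual_margin = (trump_total - clinton_total) / two_party * 100  # roughly +1.56% Trump

poll_avg_clinton = 4.625                   # Clinton's average battleground poll lead
swing = poll_avg_clinton + actual_margin   # roughly a 6.185-point miss toward Trump

poll_avg_biden = 3.64                      # Biden's current average battleground poll lead
projected_2020 = poll_avg_biden - swing    # roughly -2.545, i.e. losing these states by 2.545%
print(round(actual_margin, 3), round(swing, 3), round(projected_2020, 3))
```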

Voter Enthusiasm 2016 vs. 2020

Month | 2016 | 2020
Dec. 2015 | R+13 | -
January | - | -
February | - | -
March | T+11 | -
April | - | T+29
May | - | -
June | - | T+37
July | - | T+28
August | - | T+17
September | - | T+31
October | - | -
November | T+13 | -

(- = no enthusiasm poll found for that month)

 

Enthusiasm Average: Trump up 12.33% over Clinton in 2016; Trump up 28.4% over Biden in 2020. That is a 16.07% larger enthusiasm edge over Biden (2020) than he had over Clinton (2016). Enthusiasm influences not only whether people show up to vote, but also whether they bother to look into absentee ballot laws and registration, make sure their ballots are notarized where that is required, follow up to see whether their votes were received, and talk others into voting for their candidate of choice.
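Once more, a minimal sketch (Python) of where the enthusiasm averages come from, using only the months that have a reading in the table above; the split between the 2016 and 2020 columns is the one that reproduces the quoted averages.

```python
# Enthusiasm-gap readings from the table above (positive = Trump advantage).
gap_2016 = [13, 11, 13]          # Dec. 2015, March, November (vs. Clinton)
gap_2020 = [29, 37, 28, 17, 31]  # April, June, July, August, September (vs. Biden)

avg_2016 = sum(gap_2016) / len(gap_2016)  # 12.33
avg_2020 = sum(gap_2020) / len(gap_2020)  # 28.4
print(round(avg_2016, 2), round(avg_2020, 2), round(avg_2020 - avg_2016, 2))  # difference ~16.07
```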

Please share your thoughts, but keep it focused on the data. This thread isn't meant for political arguments, but rather for having some interesting data to look at and discuss. I hope I'm not the only one who likes to look at this stuff lol.

 

Last edited by Dulfite - on 04 October 2020


I'm at a bit of a loss to understand why this needs a separate topic from the 2020 election topic, and you'll probably have to justify that if this is to stay open.

There are also issues beyond that. I'm really unclear on what data is being presented or why.

For instance, in the first table, I'm not sure why it makes sense to give an average of the results. To some extent I get why you don't want to take a snapshot of one data point, but I can't see any point in weighing the results from 240 days ago as heavily as the results from 30 days ago. Results tend to get more accurate as the election gets closer, and current events have changed a lot since 240 days ago. I have no idea why an average would be a valuable metric. Real Clear actually jettisons polls from its averages after a certain amount of time, and FiveThirtyEight discounts them over time. Including 8-month-old data in your analysis doesn't make statistical sense.
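To illustrate the difference (a toy example of my own, not how either site actually weights polls): an average that discounts older snapshots behaves very differently from the flat average in the opening post, which counts the 240-day-out number as much as the 30-day-out one.

```python
# Toy comparison of a flat average vs. a recency-weighted average of poll snapshots.
# The margins are the Biden 2020 numbers from the first table; the 45-day half-life
# is made up for illustration and is not RCP's or FiveThirtyEight's actual method.
days_out = [240, 210, 180, 150, 120, 90, 60, 30]
margins = [2.6, 2.2, 3.6, 3.5, 5.3, 5.0, 3.1, 3.8]

HALF_LIFE = 45.0  # hypothetical: a snapshot's weight halves for every 45 days of age
weights = [0.5 ** ((d - min(days_out)) / HALF_LIFE) for d in days_out]

flat = sum(margins) / len(margins)
weighted = sum(w * m for w, m in zip(weights, margins)) / sum(weights)
print(f"flat average: {flat:.2f}, recency-weighted average: {weighted:.2f}")
```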

The second chart... I don't really know what I'm looking at. It seems you're giving the results in raw vote totals and comparing them to percentages? That doesn't work... If we're interested in percentages, just use percentages all around. How is the average 14 when none of the numbers are above 3? I'm lost.

Compared to 30 days out, Clinton was overrated by about 4% in PA, 1% in Arizona, 7.2% in Wisconsin, 3.9% in Michigan, by 2.8% in North Carolina, and by .8% in Florida. I can't see how you got to 6%. 

The only explanation I can think of is that you're using the 8-month-old data as heavily as the current data, which, again, you shouldn't be doing. And there's a huge problem with that, because we don't know how accurate or inaccurate the 240-days-out data was. It is quite possible that the data 240 days out was a 100% accurate prediction of what would have happened if the election had been held that day. As more news came out and more people decided, things changed. Comparing an average lead over 8 months to what happened on election day really makes little sense to me.

And I'm not sure why averaging this would be appropriate in any event. The polls were not equally "off" across the battleground states in 2016. They were actually mostly accurate, aside from the rust belt states, Wisconsin, Pennsylvania, and Michigan in particular. You're basically taking how off the results were in those states and sharing the love. If you took an average of all 50 states, that maybe would be viable, but you're essentially applying an error in three states to every other state, and I can't see why. I don't know why we should expect Florida, for instance, to be off by 6 points in favor of Biden when it was only off by 1% in favor of Hillary. In Texas, Hillary was underrated by 3 points... should we expect 2020 polls to be off 6 points in favor of Biden? If not, why should we do that in Florida?
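A quick illustration of the problem, using the rough per-state misses I listed above (the helper function and the 3-point lead at the end are hypothetical, just to show the contrast): applying one blended miss to every state pretends Florida will be off as badly as Wisconsin was.

```python
# Rough 2016 polling misses (points by which Clinton was overrated), as listed above.
miss_2016 = {"PA": 4.0, "AZ": 1.0, "WI": 7.2, "MI": 3.9, "NC": 2.8, "FL": 0.8}

blended = sum(miss_2016.values()) / len(miss_2016)  # about 3.3 points
print(f"blended miss applied to every state: {blended:.2f}")

def project(lead_2020: float, state: str) -> float:
    """Hypothetical helper: subtract a state's own 2016 miss from its 2020 lead."""
    return lead_2020 - miss_2016[state]

# With a hypothetical 3-point lead in both states, the state-by-state picture diverges:
print(project(3.0, "FL"), project(3.0, "WI"))  # 2.2 vs. -4.2
```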

The enthusiasm numbers are mostly unsourced, and the one you did source is misrepresented. That study specifically focused on one age group. It also measured enthusiasm for the candidate, not for voting in general. People can be enthusiastic about voting against someone. Furthermore, you only have one data point per month, which is not enough to draw a conclusion about overall enthusiasm. If you're just taking one survey from each month, and not even surveys that represent the whole electorate... that's a dealbreaker, ladies. I get that enthusiasm data is hard to come by, but if we have insufficient data, we have insufficient data; we don't just extrapolate from it.

Soooooooo... there are some serious problems that need to be either explained or corrected for this to be worth keeping open, and even then there needs to be justification for this as a separate topic when there is an official topic this could fit into.



It is also important to note that you are not comparing apples to apples when talking about 2016 polls. Due to discrepancies between the 2016 polls and the results, many pollsters updated their methodologies. Most notably, pollsters have started weighting for education level, since education has emerged as a strong predictor of how people vote. This change results in a more accurate sampling of Republican voters, which should shrink error margins. Many pollsters also began modeling different turnout levels, giving a range of results.

Another difference is poll density. Take Wisconsin, for example, which had some of the widest gaps between expected and actual results in 2016. In August, there were only 3 polls in 2016, compared to 9 in 2020. In September, there were only 2 polls in 2016, compared to 15 in 2020. More data means we can be much more confident that what we are seeing reflects reality.
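A back-of-the-envelope way to see why poll density matters (my own sketch, assuming each poll is independent and carries roughly a 4-point margin of error on the head-to-head margin, which is an assumption rather than a measured figure): the uncertainty of the averaged margin shrinks with the square root of the number of polls.

```python
import math

PER_POLL_MOE = 4.0  # assumed margin of error on a single poll's head-to-head margin

def moe_of_average(n_polls: int) -> float:
    """Margin of error of a simple average of n independent, equally noisy polls."""
    return PER_POLL_MOE / math.sqrt(n_polls)

for label, n in [("Wisconsin, Sept. 2016 (2 polls)", 2), ("Wisconsin, Sept. 2020 (15 polls)", 15)]:
    print(f"{label}: roughly +/- {moe_of_average(n):.1f} points on the average")
```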

Both of these changes should greatly reduce margins of error, so while the results we are seeing this year may not be as heavily Dem-leaning, it is likely that they more accurately reflect reality. It is also important to remember that Comey announced on October 28th that the FBI was reviewing newly discovered emails tied to the Clinton investigation, a huge October surprise that came so close to the election that its full effect likely was never seen in the polls.

As a bit of a sidenote, I don't really like RCP. There are wide disparities in poll quality and poll lean that aren't factored in, which often results in pretty wide swings in RCP's averages. One outlier poll can swing an RCP average by several points in one day, which isn't ideal. If you look at Pennsylvania in 2016, for example, Clinton had roughly a 3-point lead on October 3 but a 9-point lead on October 8. The race did not change that much in a few days, but because of how RCP handles its average, you regularly see these wild swings. As such, comparing a single point in RCP data doesn't really tell you very much.
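To show the kind of swing I mean (the poll numbers here are made up, and RCP's real inclusion rules are more involved than a fixed five-poll window): in an unweighted rolling average, one outlier entering the window can move the number by a couple of points overnight.

```python
# Hypothetical unweighted rolling average over the five most recent polls.
# These margins are invented for illustration; they are not real RCP data.
recent_polls = [3, 4, 2, 3, 4]                      # candidate leads in the five newest polls
avg_before = sum(recent_polls) / len(recent_polls)  # 3.2

outlier = 12                                        # one new outlier poll enters the window...
recent_polls = recent_polls[1:] + [outlier]         # ...and the oldest poll drops out
avg_after = sum(recent_polls) / len(recent_polls)   # 5.0

print(f"before: {avg_before:.1f}, after: {avg_after:.1f}")  # nearly a 2-point jump from one poll
```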