How the Pollsters Changed Their Game After Getting the 2016 Election Wrong

Voters cast ballots at a polling station at the Big Bear Lake Methodist Church in Big Bear, California, November 8, 2016. Bill Wechter/AFP/Getty Images

In 2016, virtually every major pollster and election forecaster showed Hillary Clinton winning by a landslide.

Then Donald Trump became president.

The forecasting breakdown spurred several "autopsy" reports analyzing what exactly went wrong. Possible explanations included a failure to track a substantial shift in voter preference in the final days of the campaign, too little polling in battleground states like Michigan and Wisconsin, and a "silent majority" that did not reveal its support for Trump until after the election.

"We still don't know one hundred percent for sure," said Sean Trende, a senior elections analyst for the political news site and polling aggregator RealClearPolitics, when asked what happened four years ago.

But most experts point to the underrepresentation of white voters without college degrees in survey samples as the single biggest problem of 2016, an error that led pollsters to grossly underestimate Trump's support throughout the campaign.

Educational attainment only became a major polling issue around 2012. Before then, white voters with and without college degrees voted Republican at roughly the same rate. Now, college-educated white voters lean more Democratic while white voters without degrees trend Republican. By 2016, the gap had grown so large that a poll that did not weight its sample by education could be off by several points.

"That's one thing polling has fixed," Patrick Murray, the director of the Monmouth University Polling Institute, told Newsweek. "We've done it here at Monmouth, and most other people have, but not everyone."
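As a rough illustration of the fix Murray describes, here is a minimal sketch of weighting a poll sample by education. All population shares, sample sizes, and responses below are invented for illustration, not real polling data.

```python
from collections import Counter

# Hypothetical population shares by education group
population_share = {"college": 0.35, "non_college": 0.65}

# An invented raw sample that over-represents college graduates,
# a common failure mode of phone and online polls
sample = (
    [{"edu": "college", "vote": "D"}] * 40
    + [{"edu": "college", "vote": "R"}] * 20
    + [{"edu": "non_college", "vote": "D"}] * 15
    + [{"edu": "non_college", "vote": "R"}] * 25
)

n = len(sample)
sample_share = Counter(r["edu"] for r in sample)

# Weight = population share / sample share for each education group
weights = {edu: population_share[edu] / (sample_share[edu] / n)
           for edu in population_share}

def margin(rows, w=None):
    """Democratic margin, optionally weighted by education group."""
    get = (lambda r: w[r["edu"]]) if w else (lambda r: 1.0)
    d = sum(get(r) for r in rows if r["vote"] == "D")
    rep = sum(get(r) for r in rows if r["vote"] == "R")
    return (d - rep) / (d + rep)

raw = margin(sample)                # unweighted, skewed sample
adjusted = margin(sample, weights)  # weighted to population shares
```

Under these invented numbers, the unweighted sample shows the Democrat up 10 points, but weighting the non-college group up to its assumed population share flips the race to a roughly 4.6-point deficit. That is exactly the kind of several-point swing the education fix is meant to prevent.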

Lee Miringoff, the director of the Marist College Institute for Public Opinion, doesn't share the view that educational status is to blame. The institute, which was one of the most accurate pollsters of the 2016 race, hasn't changed its approach for 2020.

"Because we didn't have that problem," Miringoff told Newsweek, "we didn't go under the hood and start tinkering as other pollsters have done."

The real problem with the 2016 election, Miringoff said, was the forecasters who predicted a landslide Clinton win. Nearly every election model showed Clinton clinching the Electoral College by a wide margin. FiveThirtyEight's Nate Silver gave Trump the best chance of any major forecaster, putting Clinton's odds at 71 percent. Others, such as HuffPost, put her odds of becoming president as high as 98 percent.

"There was this perception in what was really a close race that Clinton was overwhelmingly the likely winner," he said. "Those were not polls, they were forecasting models. And they're doing it again right now."

Today, most forecasters say that Democratic nominee Joe Biden is very likely to beat Trump in the Electoral College. The Economist currently gives Biden an 86 percent chance of winning it. FiveThirtyEight's model has Biden winning in 77 of 100 simulated election outcomes.
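Probabilities like these come from simulation: a forecasting model draws many hypothetical elections from a distribution around the polling average and counts how often each candidate wins. A toy national-level sketch, not any outlet's actual model, with an invented polling margin and error assumption:

```python
import random

random.seed(0)

POLLED_MARGIN = 0.07  # hypothetical leader's +7 national polling average
TOTAL_ERROR = 0.05    # assumed std. dev. of combined polling/systematic error

def win_probability(n_sims=100_000):
    """Share of simulated elections in which the polled leader wins."""
    wins = sum(
        random.gauss(POLLED_MARGIN, TOTAL_ERROR) > 0
        for _ in range(n_sims)
    )
    return wins / n_sims

print(round(win_probability(), 2))  # ~0.92 under these assumptions
```

The point of the sketch is that a wide assumed error turns even a seven-point lead into a meaningful chance of losing, which is also why forecasters looking at the same polls can disagree as sharply as 71 versus 98 percent: they assume different error distributions.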

Democratic presidential nominee Joe Biden (L) and U.S. Vice President Mike Pence (R) greet each other during a 9/11 memorial service at the National September 11 Memorial and Museum on September 11, 2020 in New York City. Amy Alfiky/Pool/Getty Images

Miringoff called forecasting models and polls "two very different animals." The former are predictive, while the latter offer a single snapshot of where the electorate stands at a specific point in time.

"The perception is what the polls did wrong, and there were a lot of polls that were not done well," Miringoff said. "But the large share were pretty much on target."

The 2016 election, one of the most volatile contests in U.S. history, carried a level of uncertainty beyond what polls are able to capture.

"A whole host of things happened that the polls would not have caught," said Murray from Monmouth University. "There is an overemphasis on this idea that polls are more precise than they can actually be."

Those factors included a high number of undecided voters going into the final weeks of the election, an unusually high number of voters considering a third-party candidate, and a high number of Clinton supporters who ended up staying home.

Murray said that he's attempting to address these shortcomings by publishing polls with a range of likely-voter scenarios, such as a high-turnout election versus a low-turnout one, to better convey the level of uncertainty that often exists in polling.
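A minimal sketch of what such a turnout-range release might look like, with invented respondents each carrying a 0-to-1 likelihood-of-voting score. The scores, preferences, and cutoffs are all hypothetical, not Monmouth's actual method.

```python
# Each tuple: (candidate preference, turnout likelihood score)
respondents = [
    ("D", 0.90), ("D", 0.50), ("R", 0.95), ("D", 0.85), ("R", 0.60),
    ("D", 0.40), ("R", 0.90), ("D", 0.70), ("R", 0.80), ("D", 0.95),
]

def dem_margin(min_score):
    """Democratic margin among respondents passing a turnout cutoff."""
    voters = [pref for pref, score in respondents if score >= min_score]
    return (voters.count("D") - voters.count("R")) / len(voters)

high_turnout = dem_margin(0.40)  # low bar: nearly everyone votes
low_turnout = dem_margin(0.80)   # high bar: only the most likely voters
```

Under these invented numbers, the high-turnout scenario shows a 20-point Democratic lead while the low-turnout scenario is a tie, so publishing both conveys a range rather than a single number with false precision.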

He also said that so far, 2020 is nowhere near as uncertain as 2016. Pollsters estimate that the movable, or persuadable, portion of the electorate is less than half the size it was four years ago, one possible explanation for why poll numbers for Biden and Trump are roughly the same today as they were several months ago.

But Trende, from RealClearPolitics, stressed that polls will never be completely accurate.

"Polls almost never hit the election right on the head," he said, noting that there was also a substantial polling error in 2012 as Barack Obama exceeded survey expectations. "Most people didn't care because polls had already predicted a win for him. In 2016 you had an error of the same magnitude, but it just happened to be in the other direction."

Pollsters "are not quite as powerful as people ascribe them to be," Miringoff told Newsweek.

"People are not sitting around waiting for the next Gallup poll or Marist poll to see what they think they are going to do in terms of voting," he said. "They make up their own minds based on the campaign, not what the polls are saying."