Column: Opinion polls are usually right. Here's what to watch out for

(Genaro Molina/Los Angeles Times)


Elections 2024, California politics

David Lauter

March 15, 2024

You can’t understand politics without polls; you can misunderstand a lot with them.

With both major parties' nominations sewn up, we're deep in the season when worrying about polls can become an obsession. That's especially true this year, as former President Trump holds a small but persistent lead over President Biden in most national and swing-state surveys.

That has led many Democrats to dig deep into the polls in an often self-deceptive search for errors.

The fact is that polls still accurately reflect election results the vast majority of the time. They are also an indispensable tool for democracy in informing residents of a vast and diverse nation about what their fellow Americans believe.

At the same time, errors do occur, and they often stem from difficulties in collecting the data or in interpreting it.

Let’s take a look at a few examples this week and see how the LA Times polls have done this primary season.

A Holocaust myth?

In December, The Economist published a surprising poll result: One in five young Americans, its headline said, thinks the Holocaust is a myth.

Fortunately for the country, if perhaps not for the publication, it's the survey's finding that may have been mythical.

In January, the nonpartisan Pew Research Center set out to see if it could replicate the finding. It couldn't. Pew asked the same question the Economist survey had asked and found that the share of Americans ages 18 to 29 who said the Holocaust was a myth was not 20%, but 3%.

What happened?

The problem isn’t a bad pollster: YouGov, which conducts The Economist’s surveys, is one of the country’s most highly regarded polling firms. But the methodology YouGov uses, known in the polling world as an opt-in panel, can fall victim to fake respondents. That may have been the case here.

Panel surveys are a way around a major problem pollsters face: These days, very few people will answer phone calls from unknown numbers, which makes traditional telephone polling extremely difficult and expensive to conduct.

Instead of randomly calling phone numbers, polling organizations can recruit thousands of people willing to take surveys, usually in exchange for a small payment. For each survey, the pollsters select people from the panel to create a sample that is representative of the total population.

Some people participate only for the money, though, and race through surveys, answering questions more or less at random. Previous Pew research has found that such fake respondents typically claim to belong to groups that are hard to recruit, including young people and Latino voters.

Pollsters have found evidence of organized efforts to infiltrate panels, sometimes involving “multiple registrations by people located outside the U.S.,” Douglas Rivers, the chief scientist at YouGov and a professor of political science at Stanford, wrote in an email. These may be efforts to boost particular causes or candidates or, more often, schemes to make money by collecting small payments over and over.

“We have a suite of procedures in place to exclude these panelists,” Rivers wrote, adding that the company continued to analyze what was happening on the Holocaust issue.

In polls in close elections, fake respondents who answer randomly will usually “kind of cancel each other out,” says Andrew Mercer, senior research methodologist at Pew.

“But for something that is very rare, like Holocaust denial,” random responses will produce errors that are all on one side. “It will ultimately drive up the incidence,” he said.

For example, in previous research, Pew found that 12% of respondents in opt-in survey panels who said they were under 30 also claimed they were licensed to operate a nuclear submarine.
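Mercer's point about one-sided errors can be sketched with a little arithmetic. The numbers below are hypothetical, chosen only to illustrate the mechanism, not drawn from Pew's or YouGov's actual data:

```python
# Hypothetical illustration of Mercer's point: a small share of
# random "click-through" responders barely moves a 50/50 question,
# but can badly inflate the measured rate of a rare opinion.

def observed_rate(true_rate, fake_share, yes_prob=0.5):
    """Rate a poll would measure if `fake_share` of respondents
    answer 'yes' at random with probability `yes_prob`."""
    return (1 - fake_share) * true_rate + fake_share * yes_prob

# Suppose the true rate of Holocaust denial among young respondents
# is 3%, and 1 in 10 panelists answers 50/50 at random.
print(round(observed_rate(0.03, 0.10), 3))  # 0.077 -> more than double the true 3%

# The same fakes leave a near 50/50 election question unchanged,
# because their random errors cancel out.
print(round(observed_rate(0.50, 0.10), 3))  # 0.5
```

Because the true rate of a rare opinion is far below 50%, random answers can only push the estimate up, which is why the errors, as Mercer says, are "all on one side."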

The lesson here is an old one, popularized by the late astronomer Carl Sagan: “Extraordinary claims require extraordinary evidence.” If a poll result seems too remarkable to be true, chances are it isn’t.

Drawing conclusions

A second category of potential problems has less to do with the data than with how people, journalists especially, interpret it, drawing definitive conclusions from less-than-definitive numbers.

Consider how much progress Republicans are making among black and Latino voters.

As I’ve written before, there’s no doubt that Republicans gained ground between 2016 and 2020, especially among Latino voters who already identified as conservative. There was also a smaller movement toward the Republican Party among black voters.

Has that trend continued? Some recent research, including the much-cited New York Times/Siena College poll, suggests this may have accelerated. The poll shows Biden has lost support among younger black and Latino voters.

In a recent article that attracted a lot of attention, John Burn-Murdoch, the Financial Times’ chief data journalist, combined data from different types of polls to declare that “American politics is in the midst of a racial realignment.”

But the response from many political scientists and other analysts was, “Not so fast.”

Pre-election surveys can tell you what potential voters are thinking today, but it’s important to compare them to past election results, they noted.

If the actual results in 2024 match what the New York Times/Siena polls currently find, “fine, let’s talk about racial realignment,” said John Sides, a professor of political science at Vanderbilt University. But until then, “we’ll have to wait and see.”

How our polls did

Our UC Berkeley Institute of Governmental Studies/Los Angeles Times polls have had a remarkably good run of predicting elections.

For example, our most recent poll this year showed Proposition 1, the $6.4-billion mental health bond measure backed by Gov. Gavin Newsom, with the support of 50% of likely voters.

As of Thursday morning, that was almost exactly where the result stood: The “yes” vote was at 50.2%, with nearly 90% of the state’s ballots counted.

The poll also correctly predicted that Democratic Rep. Adam B. Schiff of Burbank and Republican former Dodgers player Steve Garvey would be the top two finishers in the Senate primary, with Democratic Rep. Katie Porter of Irvine in third.

The survey, conducted about a week before the election, showed Garvey with 30% of the vote, Schiff with 27% and Porter with 21%, with 9% of voters still undecided.

The poll came very close on Garvey’s number: With about 800,000 votes left to count, he has 32%, within the poll’s estimated margin of error of 2 percentage points in either direction. The survey slightly underestimated support for Schiff, who also has 32%, and overestimated support for Porter, who currently sits at 15%. That could mean the last undecided voters broke for Schiff.

That level of accuracy is not unusual. For example, during the 2022 midterm elections, polls from nonpartisan groups, universities, and media organizations were highly accurate.

For people interested in politics, especially in a hotly contested election year, there’s a takeaway in all of this: Don’t put too much weight on any individual poll, especially one with a striking finding that hasn’t turned up anywhere else. Be skeptical of sweeping conclusions about events that are still unfolding. And even, or perhaps especially, when a poll shows your favorite candidate lagging behind or likely to lose, take it for what it is, neither an oracle nor a nefarious plot, but a snapshot.

