
Don't Be Misled: Why Understanding the Nuances in Polling Is Crucial in the Israel-Hamas Conflict

This week, we’ve seen a flurry of public opinion polls in America about the Israel-Hamas conflict. It can be easy to draw quick conclusions – or even have knee-jerk emotional reactions – from what gets presented. But that’s dangerous when the responses we’re reacting to are not nearly as straightforward as they seem. There’s a lot of nuance and complexity hidden behind those numbers.

In this article, I’ll break down the key things to look out for when reading polls. By the end, you’ll have a toolkit for making sense of polls and cutting through the noise. You’ll be able to spot potential problems and interpret the results in a more meaningful way. So, let’s get started and learn how to be smarter consumers of public opinion data.

Who’s being polled?

As you may recall, the strategy I’ve been promoting since October 7 is to focus on maintaining the support of the global leaders who stood by Israel’s side from that day (primarily, President Biden). This is because we need their help to continue fighting our war with Hamas, to negotiate the release of the hostages, and to defend us at the U.N. and other global platforms where diplomatic pressure can be exerted.

Considering that President Biden is running for re-election this year, it’s a reasonable bet that he’ll be looking closely at what his potential voters think about his handling of the Israel-Hamas war. As such, if we want to know what may be influencing Biden’s stance vis-à-vis the conflict, it’s crucial that the respondents of the poll you’re reviewing are registered voters, not just the general population, who may or may not be registered to vote. A poll that samples the “general population” over the age of 18 may not accurately reflect the opinions of the electorate and could lead to misleading conclusions about how much pressure President Biden is under from them.

So, next time you look at a poll, check whether all the respondents were registered voters. If not, take it with a grain of salt – as an indicator of a trend in public sentiment, nothing more.

The sample size

A reliable sample size for a public opinion poll in the United States depends on several factors, such as the margin of error, confidence level, and the size of the population being studied. As a rule of thumb, for a population as large as approximately 330 million people, a sample size of around 1,000 to 2,000 respondents is generally considered reliable for most purposes.

If you want to get really nitty-gritty, for a national poll with a margin of error of ±3% at a 95% confidence level, a sample size of 1,067 is typically sufficient. This means that if the poll were conducted multiple times using the same methodology, 95% of the time, the results would be within 3 percentage points of the true population value.
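If you want to check that math yourself, here’s a minimal sketch (in Python, purely for illustration) of the standard formula for a simple random sample, assuming the most conservative case of a 50/50 split of opinion:

```python
def required_sample_size(moe, z=1.96, p=0.5):
    """Sample size needed for a given margin of error (simple random sample).

    Standard formula: n = z^2 * p * (1 - p) / moe^2.
    z = 1.96 is the z-score for a 95% confidence level, and p = 0.5 is the
    most conservative assumption (it maximizes the variance of the estimate).
    """
    return z ** 2 * p * (1 - p) / moe ** 2

# A ±3% margin of error at a 95% confidence level:
print(round(required_sample_size(0.03)))  # -> 1067, the figure quoted above
```

Keep in mind that real pollsters layer weighting and design effects on top of this, so treat it as a rule of thumb rather than a full polling methodology.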

For smaller populations or specific subgroups (e.g., a particular state or demographic), smaller sample sizes may be sufficient, but be wary of how reliable the results are. For a subgroup with a sample size of 100, the margin of error is approximately ±10% at a 95% confidence level, which means the results for that subgroup are far less precise than the overall results. Take, for example, the latest Gallup poll on opinions about the Israeli-Palestinian conflict: it surveyed 1,016 Americans (general public), of whom only 126 were aged 18-35. That’s not a lot of people from which to draw reliable conclusions about an entire generation in America. But, we should certainly take note of the trend…
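To see why that 126-person subgroup is so shaky, here’s the same formula run in reverse – again a back-of-the-envelope sketch assuming a simple random sample and a 50/50 split:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Margin of error for a proportion estimated from n respondents.

    Standard formula: moe = z * sqrt(p * (1 - p) / n), with z = 1.96 for a
    95% confidence level and p = 0.5 as the most conservative assumption.
    """
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(1016):.1%}")  # ~3.1% for Gallup's full sample
print(f"{margin_of_error(126):.1%}")   # ~8.7% for the 18-35 subgroup
print(f"{margin_of_error(100):.1%}")   # ~9.8%, i.e. the ±10% mentioned above
```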



The bottom line is this: when evaluating the reliability of a poll, always consider the sample size (including that of the subgroups) to ensure that it’s statistically reliable. This is especially true if you’re presented with qualitative data – i.e., insights that surface from focus groups (usually 8-10 people at most) or one-on-one interviews. These are never more than an indication of sentiment and must be backed by larger, quantitative surveys in order to be relied upon.

How the questions are worded

The way questions are phrased can influence responses. Look for neutral, unbiased language that does not lead respondents towards a particular answer. Be cautious of loaded or double-barreled questions that ask about multiple issues simultaneously.

Here’s an astonishing example of how a reputable polling company, YouGov, seems to have completely missed the mark in the way a question was worded. The results should not surprise you, nor should the announcement by the pro-Palestinian activist group Code Pink, which gladly shared the data.




We can’t fully blame Code Pink, as the same sentiment was expressed in the – rather leading – press release about the poll, which was commissioned by the Center for Economic and Policy Research, under the catchy title: “Poll: Majority of Americans Say Biden Should Halt Weapons Shipments to Israel”. Notice the language that leads us to conclude there has been a major change in American public sentiment?

Another common mistake is assuming that similar questions from different polls can be compared. Take, for instance, the question of support for either side of the Israeli-Palestinian conflict. We know for a fact that sympathy for the Israelis in America has always been significantly higher than for the Palestinians (although you might find that hard to believe), as can be seen in these Gallup and YouGov longitudinal studies:





So, yes, support for Israel has been higher, but never over 62%. Yet, when the latest Harvard CAPS-Harris poll was published this week, it was widely celebrated amongst the pro-Israel community because of its main finding: a full 82% of Americans support Israel!




The immediate conclusion based on these results was that there is deep and overwhelming support amongst Americans for Israel’s continued ground incursion. While I would love this to be true, I must caution that it’s a little misleading, because of the way the question in this survey was framed, as compared to most other surveys.

In the Harvard CAPS-Harris poll, respondents were asked if they supported Israel or Hamas – whereas all the other polls ask about sympathy for the Israelis and the Palestinians (not Hamas). One asks about support, the other about sympathy. One about political leadership (Israel vs. Hamas), the other about peoples (Israelis vs. Palestinians). Note also that most other polls offer more than just two choices as responses, such as “both” and “don’t know”, which allow for more nuanced answers (and that’s why you see lower percentages of those who outright support one side or the other in these other polls).

So, what I’m saying is: don’t compare apples with oranges. Always compare surveys to previous surveys by the same company, and only directly compare questions that are framed in exactly the same terms as when they were last asked by that company.

Context and timing

Public opinion can shift rapidly in response to current events. Consider the context in which the poll was conducted and whether any significant events may have influenced responses.


For example, a massive public perceptions poll, the Global Soft Power Index 2024, measures the perceptions of over 170,000 people towards almost every country in the world. The survey was fielded between September and November 2023 – in other words, partly after October 7. We are therefore not fully surprised that Israel dropped in the overall ranking from the year before (from 27th place to 32nd): not only was 2023 a pretty challenging year for Israel regardless, but from mid-October it was appearing in the news in only one context – the Israel-Hamas war. No wonder we dropped in the ranking so dramatically, right?




Subgroups

Top-level results may obscure important differences among subgroups. Look for breakdowns by demographics such as age, gender, race, education level, and geographic location. This can provide insights into how opinion varies across different segments of the population.

We’re always checking how Gen Z is responding to Israel’s actions, and it’s important to pay attention, as the differences between the generations are vast. Other subgroups worth reviewing are Democrats vs. Republicans. Data about Democrats specifically are important because we know that the Biden administration, vying for another term, is looking closely at them. But Independents are also important, and even more so if the poll covers swing states, which is where we know the elections are going to be decided.

But don’t forget to pay attention to the sample size of the subgroups; they may be too small to draw any actionable conclusions from.

Press releases and reporting

Finally, be cautious of relying solely on press releases or media reports about polls. These often focus on top-level findings and may oversimplify nuanced results. Whenever possible, examine the full report or datasets to gain a more comprehensive understanding.

We already mentioned the way the YouGov poll commissioned by the CEPR was presented in a highly misleading press release. Here’s another fascinating case study on the power of the announcement itself to shape readers’ perceptions:


The most recent Wall Street Journal poll, published on March 3, sought to grab attention: the article’s title claimed that “U.S. Voter Sympathy for Palestinians Grows as Israel War Drags On”, and its leading paragraph suggested that “a plurality of American voters believe Israel has gone too far in its response to the October attacks by Hamas.” Worrying results indeed, and worthy of our attention.

But – and it’s an important “but” – a closer examination of the data reveals a far more complex and nuanced picture that warrants a deeper analysis of the poll results in their “rawer” form. The WSJ data reveals that while 42% of Americans (that “plurality of voters” mentioned in the article) believe Israel has gone too far in its response to the Hamas attacks, a nearly equal percentage (43%) holds a different view: 24% of Americans think Israel’s response has been about right, while 19% believe it has not gone far enough. This combined 43% is a significant counterpoint to the “gone too far” perspective, highlighting the importance of considering the full range of opinions when interpreting poll results.





The poll also sheds light on American perceptions of U.S. support for both Israelis and Palestinians. Interestingly, the data shows little difference in these perceptions, contrary to the way it was presented in the article: 57% of Americans say that the U.S. is doing too little or about right for the Israeli people, while 58% express the same sentiment regarding U.S. support for the Palestinian people. This finding challenges the article’s emphasis on the growing share of Americans who believe the U.S. isn’t doing enough to help the Palestinian people, as it overlooks the fact that a similar share of Americans hold the same view about U.S. support for Israelis.

Yes, the Palestinians have more sympathy on their side, which is hardly surprising under the circumstances. But to then conclude that Americans have very little sympathy for – or desire to support – Israel is simply wrong. Easily concluded from how the information is presented, yet still wrong.

To conclude

The Wall Street Journal poll and other recent surveys offer valuable insights into the complexities of American public opinion on the Israel-Hamas conflict. But we have to think for ourselves when consuming such data. When we choose not to take headlines and “clickbait” opening paragraphs at face value, and consider the nuances in the text and visual graphics, we can develop a better understanding of how Americans really feel about the situation.


In summary, always consider:


  • Whether the respondents are registered voters or not – surveys of registered and likely voters are more indicative for us than those of the “general public”;

  • The sample size – it should be at least 1,000 for a national poll; and when looking at subgroups, if the numbers are very small, just don’t jump to any major conclusions from the results;

  • The way the questions are worded – steer clear of questions that are “leading the witness”; they’re not reliable enough, or worse;

  • When the survey questions were fielded – be cognisant of any major events that happened before, during, or after that time frame, and read the results accordingly;

  • Subgroups – pay attention to the different demographics and other subgroups to decipher more detailed results, but remember the sample size disclaimer; and

  • Beware of catchy titles and summary press releases – don’t just read the press release; if you can be bothered, look at the “topline results”, which are typically linked from the high-level findings. That’s where the real juice is!

I know I may be asking you for too much, but this data is so important for us to be able to make wise choices on how to handle global public opinion about Israel.


If you can’t be bothered to do the legwork yourself, you are welcome to check out The Perception page of the Hasbara Toolbox, where I share my analyses of polls as they are released. The more informed we are, the more effective advocates we’ll be!
