View from The Center

Harris Poll’s Will Johnson: Why Polling Still Matters


“At the end of the day, it’s a pretty simple business that we’re in: We want to understand culture, we want to understand consumers’ thinking, and we’re just trying to answer why’s.”

On April 21st, Merion West editor-in-chief Erich Prince was joined by Will Johnson, who serves as chief executive officer of The Harris Poll. In addition to overseeing The Harris Poll, Mr. Johnson frequently contributes media commentary at publications including the Chicago Tribune, The Washington Post, and The Boston Globe. In their conversation, Mr. Johnson makes the case for the continued relevance of polling despite the reputational damage it endured following the presidential elections of 2016 and 2020, discusses preferred methodologies for accurately capturing public opinion, and brings into focus a few relevant current trends The Harris Poll has discovered with its recent surveys. This interview was lightly edited for clarity. 

The first thing I wanted to ask you was discussed also in an interview I conducted with Frank Newport, who was editor-in-chief of Gallup at the time, in 2018. The big conversation topic then, as you know, was this idea of the reputational hit that polling took after 2016. Some people, in fairness, might say “Well, what are you talking about? A lot of those election results in 2016 were within the polls’ margins of error.” But for better or worse, a lot of people started seriously doubting polling, and that happened even more so following the election of 2020, especially with, for instance, the polling of Wisconsin. As someone who is at the helm of a polling outfit, what do you make of this narrative that polling has already had its day in the sun?

It’s a great question. It has been absolutely necessary for the industry to do some soul searching, after 2016 and then this last election. Polling isn’t dead, but decision-makers and the media need to know where public opinion lies to be able to make informed decisions. You cannot really get the sentiment of consumers by just looking at behavioral data. If you look at those ones and zeros, you’d think the whole world is black and white, but we are just so gray. We are so nuanced. The only way to get that is by asking questions. For example, people are pro-Roe v. Wade, but they are also open to limiting abortions [by trimester]. They are for the First Amendment, but they’re concerned about too many illegal guns on the street. They are both for Black Lives Matter and are against defunding the police. You are not going to get that unless you find a way to speak to them and in a quantitative way.

That is survey research. That said, reform is necessary. Bias had crept into survey design for several reasons: the way questions were asked, and the failure to find the right respondents, were often just a function of the status quo and of laziness. We need aggressive policing of fraudulent and inauthentic responses. Furthermore, we need to look at whom you are contacting and how you are asking the questions, making sure those questions eliminate any bias the survey writer may have. We are proud of our last Harris poll on the election. We don’t do a lot of political polling; we don’t work with political clients, PACs, or things of that nature, but we had Biden ahead by four points. That was about as good as any national pollster had in 2020. 

So, polling is not going away; it’s still quite necessary. You’re not going to get all the answers and all the why’s by just watching people’s buy and app data. However, our industry must be vigilant about understanding how we’re getting our information and making sure our biases are checked at the door.

You have identified some possible areas of improvement in terms of methodology that polls that want to be accurate can pursue to learn from previous experience. You alluded to a couple: checking survey biases at the door and making sure that responses are valid. Can you discuss further some of the methodological efforts that Harris Poll is making to ensure as accurate of data as possible?

For sure—there is this idea of system one and system two thinking that you can go deeper into, but it has been well covered. Getting at system one—our faster, automatic, intuitive, emotional mode of thinking—is key. How you bring that out in a survey is, I think, a really good method of surfacing, for example, what a product should be talking about—or just getting the essence of a communication strategy. Tapping into that implicit thinking is important. The way you do that is by forcing consumers to make trade-offs in preferences, having them move fast through their decision processes, as they do in real life. An example is our Harris brand platform, which is our SaaS offering. We’ll show a brand, and then the respondent will be shown a list of 20 or so adjectives and asked to click “Yes” or “No” very fast on whether each adjective describes the brand. That real-time perceptual mapping really does elicit a more genuine answer to the question of what the respondent is feeling. 

Again, I talked a little bit about stated research in the previous response and how surveys and polling aren’t going away. That said, augmenting stated research with behavioral data—that is really where the magic happens in our industry and where Harris is making major investments. That’s the data science and analytics currently being used. When you think about analytics in its most basic form, the idea is to combine multiple statistics to bring out a better, more accurate answer to whatever you are trying to solve. That is something we are making a major priority on our custom side. It’s amazing how our solutions have migrated from being just stated, agnostic, and not getting too much into the weeds, to augmenting all of that with behavioral data from other sources on the way to whatever our deliverable is.

Moreover, by leveraging the amazing power of our phones to do actual mobile ethnographies, you can replace the old-time focus group: going to some stale room and sitting there while a moderator asks you questions about something. Now, through the phone, you can get consumers to talk to you, see them, and feel them in real time. You can scale whatever you hear or feel by talking to them. Those three areas are the ways we’re trying to push forward. At the end of the day, it’s a pretty simple business that we’re in: We want to understand culture, we want to understand consumers’ thinking, and we’re just trying to answer why’s. Clients bring us questions, and we produce our best attempt at answering them.

Most of us are familiar with being at, say, Wells Fargo and getting a survey sent to us. However, I often hear voters say “Why aren’t they calling me?” “Why don’t these pollsters call me?” I hear this all of the time, Will. So maybe the question to you is: Who’s getting these calls?

It’s a great question. As somebody who makes a living in this business, I think the number one answer is the amazing power of statistics: it does not take a lot of respondents to be able to extrapolate a correct answer. Now, back to the original question you asked, a lot of assumptions go into who those key respondents are, so it’s not about the volume of people the survey companies are asking. You’re not hearing from them because you can predict an outcome with a very high statistical probability based on measuring a very small number of respondents; we’re talking under 1,000. That said, it puts much more of an onus on making sure that, in that small population, you’re actually measuring the right people and your assumptions about whom to measure are correct. And that’s where we really need to be judicious and non-judgmental and really tap into our discovery mindset: What is that universe? What is that population? Who are the voters going to be? Has something changed contextually that makes the way we used to measure different? And if so, are we building that in? And are we checking our own biases or the old ways of doing things? 
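[As a back-of-the-envelope illustration of why a sample of under 1,000 can suffice—this is the textbook margin-of-error formula for a simple random sample, not a description of The Harris Poll’s actual weighting methodology—the uncertainty shrinks with the square root of the sample size:]

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% confidence margin of error for a simple random
    sample of size n, at the worst case of an evenly split question (p=0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Quadrupling the sample only halves the margin of error, which is why
# pollsters rarely go far beyond roughly 1,000 respondents.
for n in (100, 500, 1000, 5000):
    print(f"n={n:>5}: +/- {margin_of_error(n):.1%}")
```

At n = 1,000 the margin of error is already about ±3 percentage points; the much harder problem, as noted above, is making sure that small sample actually represents the population.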

You mentioned that the job of polling is basically to understand various social trends or aggregate preferences and the like. In preparation for this conversation, I took a look at a couple of recent polls and studies you did on topics such as social media and the metaverse and in-person vs. remote learning. Are there certain tendencies or trends that your polling has been discovering recently that you find interesting or surprising, or particularly worth exploring further?

For sure. First of all, two-thirds of respondents are still not familiar with the concept of the metaverse. If you’re living in a city or really into tech, you see this stuff all the time. But we really need to step back and realize that for the general population, this is still a very new and very foreign, opaque thing. That’s number one. As you’d expect, younger respondents have a better understanding of it and think it will be a bigger part of their lives going forward. But there’s healthy skepticism from everyone about the trade-offs that come with it. Millennials are really excited about the potential of applications for remote work—84% there. They realize this could make their work life and flexibility a lot better. On the flip side, they also see the potential of one company owning the metaverse and what that could mean for social interaction, potentially producing negative impacts. So it’s complicated. A large percentage of our population still doesn’t really know what it is, and I think the jury’s still out.

When we talk about in-person and remote learning—another big thing that’s been on everyone’s mind—the number one finding is that this is nonpartisan; it crosses the whole political spectrum. Parents want their kids to be in school. They’ve wanted that for a long time, and they’ve also said that there’s no way a parent can be both an employee and a proctor of at-home learning. That occurs at a 92% rate. It’s very rare to see 92% of respondents in agreement [about anything]. Moreover, about 58% of parents hope that some of the things we were able to do from a technological standpoint are leveraged going forward to make education better and create a more connected environment. I can go into the pandemic, if you want. I can go into inflation.

I think inflation would be interesting.

Inflation is pretty interesting in that 84% of Americans say they’re going to cut back on spending because of price increases. The impact people are feeling is a little different right now because we are seeing crazy levels of inflation but, at the same time, the job market is so strong. People have jobs; the leverage is on the side of employees. So we’re seeing them make rational decisions given the inflation—cutting back—but we’re not seeing the same level of fear and dread that you’d see if the labor market weren’t as strong as it is.

We asked respondents whether they have used inflation as an excuse to forgo going to a social event. Two-thirds of Americans said they’ve done so at least once. That’s a question I’d want to look deeper into. I think it says less about inflation and more about [COVID-19]; it’s almost like a new excuse. We’re seeing people having a hard time readjusting and reengaging in in-person social connectivity. I found that to be a fascinating stat.

A question that comes to mind for me is that these days there seems to be a lot of different proposals being made about how we should gauge public opinion. Some want to look at Twitter keywords or various usage habits and data based on web traffic and such. Can you make the case for polling’s continued durability as either a complement to those types of measurements or something that is actually more valuable than those other types of insights being proposed?

Certainly. It is all about getting at the “why.” It’s very difficult to look at online data or behavioral data and understand why consumers are making the behavioral choices they are making without talking to them. And as much as we would like to dumb this down to just having an algorithm for getting into people’s heads—this is very complicated, and you really need to talk to them. That occurs through polling, whether at the qualitative level, with an ethnography or focus group, or quantitatively, through surveys.

One of the things that I think is very interesting about polls is that we often see a top-line report in the news media. But when we, as consumers, read a story in The Hill or Politico about poll results, there’s not as much engagement with all of the various questions in these polls, which are often thick with information.

Totally. And back again to the original question of the interview, I think that there’s an obligation on pollsters to make sure that we’re asking the questions correctly and designing the surveys in a way that’s going to elicit the truest responses. But on the editorial side, cherry-picking data or oversimplifying it is also an issue. And I understand that we have short attention spans, but in order to really understand an issue, it’s exactly what you said: You have to look beyond that top-level number. And so I would advise anyone who’s consuming polling data to always take the top line, whether that be a [presidential] job approval rating or how people feel about a particular big issue, and go look underneath. What are the drivers that are going into that metric? And then look at the composition: Who exactly were they measuring? And I hope that the editorial side will spend more time teasing out the nuances to make sure people really understand what they’re reading in that number.

You mentioned before that somebody, for instance, might favor both Black Lives Matter and ample funding for policing. But a newspaper headline might want to just draw out one of those two points and stick it into this left vs. right reporting binary without engaging with the various preferences reported on a whole number of issues that might not fit as neatly into that split.

Exactly, and that’s where, unfortunately, some of the business models, as they relate to media and getting people to click on things, favor not telling the full, more nuanced story but, rather, feed into a narrative that is not true and feels far more polarized. The biggest takeaway I have from being in this position for the last five years and seeing the millions of data points is that, as a society, we are far more nuanced and closer together on a lot of issues than you would ever think. People are smart. People are complicated. That is why we must have the discipline and take the time to really understand what exactly they are saying here. It is not as simple as a headline.
