Number 484 | July 26, 2011 |
This Week: Polls as Propaganda, Volume 1 |
Greetings, This week's Notes is Part One of what I expect to be a two-part series on the Propaganda effect of opinion polls. I'll be on vacation next week, so I'll try to get the second installment to you before I go to our Lake Superior island campsite. Nygaard
|
Here's some information that should be at the very center of our current debates about the federal budget, as it is more evidence of the national emergency facing communities of color in the United States. It's from a report by the Pew Research Center, released on July 26, 2011, entitled "Wealth Gaps Rise to Record Highs Between Whites, Blacks, Hispanics; Twenty-to-One."

"The median wealth of white households is 20 times that of black households and 18 times that of Hispanic households, according to a Pew Research Center analysis of newly available government data from 2009.

"These lopsided wealth ratios are the largest since the government began publishing such data a quarter century ago and roughly twice the size of the ratios that had prevailed between these three groups for the two decades prior to the Great Recession that ended in 2009.

"...the typical black household had just $5,677 in wealth (assets minus debts) in 2009; the typical Hispanic household had $6,325 in wealth; and the typical white household had $113,149."

If any of you saw this news on the front page of your local paper, please let me know. The report can be found online HERE. |
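Before moving on: the headline ratios in the report's title can be checked directly against the median wealth figures just quoted. A quick arithmetic sketch, using only the numbers from the Pew report:

```python
# Median household wealth in 2009, as quoted from the Pew report
white = 113_149
black = 5_677
hispanic = 6_325

# The "twenty-to-one" (and eighteen-to-one) ratios of the report's title
print(round(white / black))     # prints 20
print(round(white / hispanic))  # prints 18
```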
A couple of issues back I spoke about public opinion polls, saying that I don't think very highly of them. Before long I heard from alert reader Ken, who emailed me to say, in part: "What's your basis for jumping from a very small sample of polls to the broad conclusion, 'I believe public opinion surveys, as we know them, are a seriously flawed means of gauging public opinion'?" [That is, indeed, what I said.] He went on, "Lacking any other information from you, I read your belief as being about most or all such surveys. Do I misunderstand your position? How do you support this very sweeping statement? (My questions stem from having learned about polling many years ago in statistics classes.)"

Looking at it through his eyes, I can see that this might have seemed like a hasty, off-hand comment. But long-time readers are aware, I hope, that I rarely make any comment without thinking long and hard about it. This is no exception, and now I'll take a moment to explain some of my thinking.

Why do we have opinion polls in the first place? I think the point of polls should—or could—be to bring into focus the thinking of the population, giving shape to its needs and priorities. In a truly democratic process, representatives would arise from this process and shape the nation's structures and institutions in ways that best fit what the population wants.

Right now polls are used in just about the opposite way. Polls are used by politicians to figure out what the population wants to hear, from which is crafted a "message" that will attract sufficient votes to put or keep the politician in power. It's upside down. But that's just how polls are used. My point in this series is to talk about how they affect our thinking. That is, how they contribute to the process of Propaganda.

In NN #483 I talked about how Deep Propaganda, regardless of the intentions of individual reporters, embeds itself into news reports. The same dynamic is at work when it comes to opinion polls.
Certain common premises and assumptions live in the minds of pollsters, and they inevitably shape the questions that are asked and the answers that are considered. The entire process has the effect of simplifying a very complex world, and of steering respondents toward certain answers and away from others. Perhaps most importantly, the process has the unfortunate effect of reinforcing a certain way of thinking about the world that limits and constrains our understanding.

For the record, I have little problem with the methods used to conduct polls. I know about such often-talked-about polling issues as response bias, non-response bias, and coverage bias, and I know that they are accounted for, to some degree, by professional poll-takers. I also know about "margin of error" in polls and, while it's typically misunderstood, I'm sure that pollsters understand it and speak honestly about it.

So, for now, I'm not talking about the methods, or even the intentions, of pollsters. I'm talking about the effects, and how polls end up performing a Propaganda function. So, with that in mind, let us begin. |
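A brief aside on the "margin of error" just mentioned, since it comes up again when we look at sample sizes: for a simple random sample it has a simple arithmetic basis, conventionally reported at 95% confidence. A minimal sketch (real polls also adjust for weighting and design effects, which this ignores):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample of size n.

    The interval is widest at p = 0.5, which is why pollsters quote
    that worst case. z = 1.96 is the 95%-confidence multiplier.
    """
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(1000):.3f}")  # prints 0.031, the familiar "plus or minus 3 points"
print(f"{margin_of_error(1500):.3f}")  # prints 0.025
```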
It was back in 2001 when I first spoke in these pages about one of my problems with polls, which grew out of my very brief but formative experience as a poll-taker for the Gallup organization many years ago. (You can read those comments in Nygaard Notes #132, November 16, 2001, in my essay called "Don't Know/No Opinion.") I talked then about how polls essentially force people to simplify their otherwise complex views. The example I used was the famous "Do you approve or disapprove of the way the President is handling his job?"

I noted, in my role as a Gallup pollster, that "my experience was that many people would hesitate and ask for a chance to inject some complexity into the simple either/or equation. For example, they might say, 'Well, I like what he's doing on so-and-so, but I can't understand why he is doing such-and-such.' Sorry! No room for that sort of wishy-washy answer! How I wished I could offer not two, but three options: 'I approve,' 'I disapprove,' and 'It's not that simple, you knucklehead!' Rather than see themselves consigned to the netherworld of 'Don't Know/No Opinion,' most people would reluctantly place themselves into one of the 'pro' or 'con' camps. ('OK, I guess I approve...') And thus are the shifting sands of public approval molded into the concrete reality of headlines."

In that selection you just read I was talking about one large problem with polls: Oversimplification. That problem almost inevitably results from a polling process that tends to 1) ask "either/or" questions and/or 2) have people pick a single response from a list. While some people certainly do not know, or perhaps really do have "no opinion," I think it's often the case that they will not or cannot choose a good answer from among those offered to them. So we get responses that are not really indicative of people's best thinking. Or any thinking.

Steering

Related to Oversimplification is the problem that I call Steering.
This is when respondents are asked to choose from a list of prepared answers. I imagine this is done so that the results can be presented in a simplified way. That is, if a survey were to ask an open-ended question of 1,000 people, there is a chance that it would end up with 1,000 different answers. (The average national survey does, in fact, ask 1,000 people, more or less; sometimes 1,500.) How would that result be reported?

Consider a recent CBS News Poll that asked 1,024 people "What do you think is the most important problem facing this country today?" They offered a list: "Economy/Jobs; Budget deficit/National debt; Health care; Education; War/Iraq/Afghanistan; Other; Unsure."

The most interesting thing about this poll, to me, is that 28 percent said "Other." Of course, they didn't say "Other"; they presumably said some specific thing, something not on the list. Only 3 percent said "War." And what did they mean by "War"? That we're not winning? That the current wars cost too much? That they don't support these specific wars? Or maybe they do support them, and it's an issue for them because so many people do not support them. So, what does this poll actually tell us? Not much.

Some of the respondents may well have seen this list as something other than what it was meant to be. Instead of a list of "problems," they may have seen it as a list of symptoms. That is, some people may be like me, and see our problems with the "economy" as symptoms of a grossly unequal and consumerist culture (see this week's "Quote" of the Week). But that's not on the list! In fact, when I think of the top problems that I would like to see addressed by leadership at every level, I come up with a totally different list. Yet, were I to choose to respond to this survey, I would have to allow myself to be steered toward one of the pre-selected choices, or place myself in the "Other" group.
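The binning just described is mechanical: whatever a respondent actually says gets mapped onto the fixed option list, and everything else is collapsed into a single "Other" bucket. A minimal sketch of that mechanism, with the free-form answers invented purely for illustration:

```python
# The fixed choices from the CBS News Poll quoted above
OPTIONS = ["Economy/Jobs", "Budget deficit/National debt", "Health care",
           "Education", "War/Iraq/Afghanistan"]

def bin_response(answer):
    """Map a free-form answer onto the poll's fixed list.

    Anything not on the list, however specific or thoughtful,
    is collapsed into the single category "Other".
    """
    return answer if answer in OPTIONS else "Other"

# Hypothetical free-form answers (invented for illustration)
answers = ["Economy/Jobs", "Consumerism", "Inequality",
           "Health care", "Campaign finance"]
print([bin_response(a) for a in answers])
# prints ['Economy/Jobs', 'Other', 'Other', 'Health care', 'Other']
```

Three distinct concerns survive the trip only as the undifferentiated "Other" that made up 28 percent of the real poll.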
My reluctance to be steered is one of the things that leads me to decline to participate in surveys. Oversimplification and Steering are two obvious problems with polls as we know them. But an even bigger problem is that polls are designed using a certain way of thinking, and the process that results encourages respondents to think in that same way. This bias towards certain ways of thinking almost inevitably produces Propaganda, as we'll see in the next article. |
I have mentioned that many people's choices do not typically appear on the standard lists of "most important problems" offered by the major polls. Recall the list we just looked at: "Economy/Jobs; Budget deficit/National debt; Health care; Education; War/Iraq/Afghanistan; Other; Unsure." While it's true that I can't make a good choice from such a list, I hesitate to join the amorphous blob of "Other." What other "others" might reside there? Many people like me, perhaps. Or maybe there are many conspiracy theorists, paranoids, xenophobes, or who-knows-who. And it is certainly not true that I have "no opinion"—I have strong opinions; it's just that they're not usually on the list. This is no doubt true for many people.

Thinking a little more deeply, I realize that the problem is not really what is on, or not on, the list being offered. The problem is that the production of a coherent response to such a question requires a certain way of thinking that is different from my own. Sticking with the current example, the question is: "What do you think is the most important problem facing this country today?" What sort of thinking would be required in order to answer this? I think one would first have to see "issues" as separate and distinct, and secondly, believe that they can be ranked in order of "importance." This reflects the dominant mode of thinking in U.S. culture, but it's not my mode of thinking. Such thinking gets us into all kinds of trouble, so let's have a look at it.

Analytical Thinking vs. Systems Thinking

The intellectual tradition in the United States is based on an analytical mode of thinking, which relies for understanding on looking ever-more closely at ever-smaller parts of things. The late Russell Ackoff, organizational theorist and long-time professor at the Wharton School at the University of Pennsylvania, goes so far as to say that "Our entire culture is built on analytical thinking."
In this culture we scrutinize, we specialize, we take things apart to see how they work. Then, once we think we understand the parts, we put them back together and see what we have. This supposedly results in "understanding." The three-step process used to arrive at such understanding looks like this:

1. Take the thing apart; break it into its various parts;
2. Try to understand each part separately;
3. Assemble your understanding of the parts into an understanding of the whole.

I'm calling this mode of thinking "analytical," but it is sometimes referred to as scientific. Whatever we call it, it certainly has its uses, but it's not the only way to think. In fact, there's an entirely different way of thinking that is almost the opposite of the analytical mode. It's called Systems Thinking.

Since Systems Thinking is so important to the world inhabited by Nygaard Notes, I'll offer a few ideas that are central to the... well, the system. This is based largely on the analysis of Russell Ackoff, to whom I referred a few paragraphs ago.

* A system is a whole which is defined by its function in a larger system of which it is a part
The three-step process of Systems Thinking turns the analytical process on its head:

1. Remember that the thing you're trying to understand is a part of something bigger;
2. Try to understand that bigger system of which it is a part;
3. Then understand the thing itself in terms of the role or function it plays within that bigger system.

OK, fine, but what does all this have to do with polls, or media, or anything I'm talking about in this series? Well... I have often spoken of the media phenomenon of getting the facts right but getting the story wrong. This is usually the result of failing to look at, or to understand, the larger system that gives meaning to the facts being reported. An over-reliance on analytical thinking sometimes results in knowledge, but rarely understanding. Sometimes it doesn't even result in knowledge—at least, not useful knowledge—because we end up examining the right facts but explaining them using the wrong story.

Next week I'll try to imagine something I'll call Systems Polling, and how it might help us begin to address the Deepest Propaganda of All. |