Amid the flood of public opinion surveys that arrive almost daily on every topic imaginable, it's easy to forget that it wasn't always this way. Back in 1979, the year American Demographics was born, the river was just starting to rise. The CBS News/New York Times poll, the AP/NBC News poll and the Los Angeles Times poll were canvassing regularly and turning the results into news stories, greatly expanding on the public polling pioneered by Gallup, Harris, and Roper. Today, ABC News/Washington Post, CNN/Time/Gallup, and NBC News/Wall Street Journal (replacing the AP/NBC partnership) have all jumped into the fray, joined by many other newspapers and television stations as poll sponsors.
In addition, industry trade groups, labor unions, and all manner of issue-oriented organizations have been commissioning polls of their own in the 1990s, each seeking to demonstrate that their stance on the issues has public support. These privately commissioned polls often have the useful impact of replacing policymakers' educated guesses with hard numbers, as well as challenging conventional wisdom about what is happening outside the Beltway.
But even though we now have a great many more polls to sort through, that doesn't mean we are getting more or better information.
Kathleen Frankovic, director of surveys at CBS News, recently provided some interesting numbers on the explosion in polling and that explosion's limits. At a seminar on impeachment sponsored by the American Association for Public Opinion Research and the Freedom Forum's Media Studies Center, Frankovic compared polling during Nixon's Watergate scandal to Clinton's Monicagate.
The word Watergate was used in 355 questions between June 1972 and the end of 1974, the year President Nixon resigned. From January to October of 1998, 1,233 questions were asked using the words Monica Lewinsky, more than three times as many, according to the POLL database at the University of Connecticut's Roper Center for Public Opinion Research.
Only 95 questions mentioning impeachment were asked by the public pollsters in all of 1973 and 1974, a pittance compared to today's frenzy: Frankovic found that 415 questions were asked using the word impeachment in the first ten months of 1998.
Having several polls that look at the same subject over time can be a good thing: When the poll findings roughly agree, it reassures us that the results are tapping into genuine opinion, not just fleeting thoughts about the day's news. When such polls disagree, however, it is time to ask questions about the polls themselves and just what they are measuring. The 1998 elections offered several examples of preelection polls that differed dramatically from others tracking the same race. Not surprisingly, the poll that differed from the other surveys was also the one that differed the most from the actual election returns.
There is, in addition, a point of diminishing returns, after which more polls on the same topic simply do not offer greater understanding of the public's views. Many of the 1,233 Lewinsky questions did not contribute much to our knowledge of the way Americans felt about more complicated issues regarding the independent counsel, say, or the larger impact of impeachment. Rather, the flood of questions tended to add only to the weariness with which many people came to regard the scandal.
The deluge of polls on a single topic can also obscure differences between the polls themselves (some of which are sloppily done, using suspect methodologies), making it difficult, if not impossible, for the public to sort out the reliable information from the unreliable.
Public polling has grown and matured a great deal since American Demographics first hit the newsstands two decades ago. But there are still rough edges to consider as we head toward the year 2000.