By Alan Cooperman
(Editor’s note: This post is a response to Jacques Berlinerblau’s critique of the Pew religion survey, which you can read here.)
Most of the polling conducted by the Pew Research Center’s Forum on Religion & Public Life measures public opinions and attitudes. The U.S. Religious Knowledge Survey represents the Pew Forum’s first attempt to measure how much U.S. adults know – factually – about their own faiths and other major world religions. We’re delighted by the interest the survey has generated among journalists, scholars and the general public. We’re grateful, too, for thoughtful criticism and discussion of the best ways to measure religious knowledge. However, Jacques Berlinerblau’s blogpost calls for a few clarifications.
–“I don’t think Pew gets atheists,” Prof. Berlinerblau writes, adding (in italics for emphasis) that atheists are “usually not born and raised as atheists.” Actually, that point was made forcefully, with statistics from our survey to back it up, by Pew Forum senior researcher Greg Smith at the Sept. 28 symposium in which we released and discussed the survey results (the transcript is on our website). “We know that very few people were raised as atheists or agnostics. Indeed, our data show that about three-quarters of atheists and agnostics say that they were raised as Christians,” Smith said.
–In characterizing the survey results, Prof. Berlinerblau uses phrases such as “graded out,” “dunce cap,” and “ace the test.” Those are his phrases, not ours. As the preface to the report explains, we did not give the public (or any part of the public, including any religious group) an “A,” an “F” or any other grade because “we have no objective way of determining how much the public should know about religion.” True, one of our outside advisors, Professor Stephen Prothero of Boston University, has said that in his view, Americans as a whole “flunked” the survey. But he is free to make such statements, even if they run contrary to our analysis of the data, because his opinions are his own and not those of the Pew Forum.
–Which brings up an important point. We routinely consult outside scholars in designing our surveys; they bring valuable expertise in many areas. In this case, they included Professor Prothero, John Green, Marilyn Mellowes and, to a lesser extent, Charles Haynes and E.J. Dionne. The scholars we consult maintain their academic freedom. We are grateful for their help, but we do not endorse their policy prescriptions or other positions – past, present or future. As Prof. Berlinerblau notes, the Pew Research Center does not take positions on the issues it covers.
–In discussing the survey’s Bible-related questions, Prof. Berlinerblau mentions five (omitting two) and contends that the right answers are “factoids” that don’t constitute “serious knowledge” but “might make for a promising contestant on Jeopardy.” As we said in our report, we do not claim that the questions we asked are necessarily the most important things to know about religion, but we hope they are not trivial, and we think they are good indicators of levels of knowledge. (In that sense, the Jeopardy comparison is not such a damaging one: A person who can correctly answer most or all of the survey’s seven Bible questions probably knows a good many other things about the Bible, too.)
There are a number of other misapprehensions in Prof. Berlinerblau’s blogpost, but they are mostly addressed in the FAQs section of the Pew Forum report, which explains, for example, why sample sizes do not allow us to report results for smaller religious groups or sub-groups, such as Muslims and Orthodox Christians. And any visitor to our website will see that we have conducted numerous surveys exploring religious and political attitudes in the United States, including what Prof. Berlinerblau describes as the “real, hot, ideological action” over social issues.
Alan Cooperman is Associate Director for Research, Pew Research Center’s Forum on Religion & Public Life.