Take a Closer Look at Religion Polling
Secular religion writers love data because so much of what we write about is not quantifiable.
We particularly love outfits like Pew, Barna, ARIS, Hartford Seminary and the GSS, which survey masses of people about religiously interesting questions.
Baylor University has tried to nudge its work into that top rank with a series of surveys conducted over the past six years. Its latest results were reported last week. And there's no question the topics addressed are first-rate: Attitudes toward gay marriage and civil unions, business, belief in absolute truth, etc. The report is worth rummaging around in. But I'll tell you why I can't take it as seriously as the researchers would like.
One problem is a lack of transparency -- an unwillingness to tell us who was surveyed. Oh, there's a methodology section that explains how Gallup last year contacted what it says was a random selection of 1,714 adults. And there's an appendix showing what the questions were.
But Baylor won't tell us who they talked to.
Nor do they supply the raw data on how all the respondents answered the questions -- what pollsters call a topline. And this isn't a brief executive summary, where the omission might be barely acceptable. We're talking 68 pages of text and graphics.
Why does this matter? Because we can't assume that a poll is legit just because the pollsters say so.
The first thing I do when I get a new major religion-themed poll report is head for the basic breakdown of faith groups. That tells me two things: How deeply the pollsters drilled -- does the poll report answers from Christians only, from Catholics and Protestants, or from Catholics plus a long list of Protestant denominations? And it lets me check this poll against the many prior surveys. I usually use the "none" group as my quick check. It's easy to crosscheck against other recent polls, and I'm intrigued by how the needle has continued to rise for that group across many surveys over the past several decades.
But a poll that has a dramatically different percentage of people who say they belong to no particular religion (or any particular religion, for that matter) raises an immediate red flag for me. Because either that poll is wrong or all the others are. And if it is significantly wrong about its large-scale breakdowns, how can I trust the poll on finer-grained analysis?
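To put a rough number on "dramatically different": here's a minimal sketch of the sanity check I'm describing, assuming a simple random sample of 1,714 adults like the one described in the methodology section. The "none" percentages in it are hypothetical placeholders, since the Baylor report doesn't publish that breakdown, and the benchmark figure is only meant to stand in for other recent national surveys.

    # Back-of-the-envelope check: is a poll's "no religion" share out of line
    # with other surveys, given its sample size? (Illustrative numbers only.)
    from math import sqrt

    def margin_of_error(p, n, z=1.96):
        # Approximate 95% margin of error for a simple random sample proportion.
        return z * sqrt(p * (1 - p) / n)

    n = 1714                 # sample size reported in the methodology section
    reported_none = 0.11     # hypothetical "none" share -- the report doesn't give one
    benchmark_none = 0.16    # rough, illustrative figure from other recent surveys

    moe = margin_of_error(reported_none, n)
    gap = abs(reported_none - benchmark_none)
    print(f"margin of error: +/- {moe:.1%}; gap vs. other polls: {gap:.1%}")
    print("red flag" if gap > 2 * moe else "within normal polling noise")

If the gap is well beyond what sampling error alone could explain, somebody's numbers are off -- and that's exactly the check a missing topline makes impossible.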
Most major survey reports about religion deliver their chart of religious affiliations right up front or in an easily identified appendix. Not so much for "The Values and Beliefs of the American Public, Wave III, Baylor Religion Survey."
In fact, this report offers no comprehensive list of demographic data. Not age, income, political affiliation, race, or any of the other myriad ways that surveys slice and dice a population. And I'd suggest that Baylor has a higher bar to meet for transparency because of what the school is: An explicitly Baptist university with a theological point of view presented and promoted by the administration.
There is nothing remotely wrong with that. And it is totally possible for researchers at Baylor to do excellent independent work on questions for which the university's chosen theology offers answers that are considered received wisdom from the Almighty.
But you'll excuse me if I don't take their word for it.
I'd have the same kinds of questions if the Orthodox Union were presenting a survey about American attitudes toward kosher food, or if CAIR were presenting a poll about how Americans feel about mosques. In fact, I'd demand the same openness about data in a report produced at Harvard or (insert your favorite pointy-headed secular elite school here).
The Baylor report as delivered doesn't exactly allay my concerns about an unholy mix of religion and analysis, what with the titles of some of its chapters.
Consider "How God Sustains The American Dream." That phrase seems to include an implicit assumption or two, yes?
The chapter is actually about how belief in God, and how particular beliefs about God, seem to correlate with some aspects of demographics and politics. For instance, according to the report, people who believe strongly that there's a God who has a plan for them tend to have less education and make less money. And they're more likely to say that the government does too much and that able-bodied people should not collect unemployment.
Potentially interesting data, all. None of which speaks to the reality of an Almighty or what he might think the "American Dream" ought to be or whether he chooses to sustain it.
And even what should be straightforward analysis seems to leak theology. Here's a line: "Americans who feel strongly that God has something wonderful in store for them tend to..."
The survey question it refers to is straightforward, asking respondents whether they believe that "God has a plan for me." Whether such a plan would necessarily seem "wonderful" is a question even traditional Christian theologians have wrestled with for a couple of millennia. See: Theodicy.
Another chapter is titled "Liberals are idealists? Conservatives are realists? Think again." The analysis says: "Liberals have been historically and popularly thought of as idealists -- individuals who have high ideals and believe they can be realized -- while conservatives are often depicted as realists -- those willing to consider real world problem-solving over 'pie in the sky' thinking."
Think again? I'm trying to think about this the first time. Says who? Based on what?
Some aspects of both political (and religious) conservatism have always included a powerful yearning for ideals that may or may not have ever existed. Ronald Reagan's "Morning in America" was positively lyrical in its romantic hankering for an idyllic society that could be realized here on Earth.
And some on the left have tried to characterize their position as reflecting an absolute connection with truth. There's a popular bumper sticker (popular on the left, anyway) that flat-out says "Reality has a well-known liberal bias." Both sides, of course, also include vice versas: Conservatives who pride themselves on Realpolitik. And liberals who claim allegiance to a cause and an ideal that only tangentially reflect the world as it is.
It would have been easy enough for the Baylor researchers to present the poll results without a questionable lead-in. As they say in tennis, these are unforced errors.
I'll contrast the Baylor report with the reports by Christian pollster George Barna. He's absolutely transparent about his theological point of view. He defines his terms to a fare-thee-well. He's also totally transparent about his methodology. He offers enough basic demographic information with many of his reports so that an interested reader can easily check it against other surveys.
And he's not even claiming to do disinterested secular research. He's trying to sell books. And his analysis nearly always measures his data against the direction he'd clearly prefer it to be moving. Here's an example of a Barna report, about spirituality since 2001. The definitions are clear, the larger faith-group breakdowns are explicit and the point of view is straightforward. One need not agree with any of Barna's theology to find his work interesting and useful.
The Baylor report falls short of Barna in two ways. It's neither transparent enough nor unbiased enough in its language to be considered pure research. And if it really wants to use good data in the cause of promoting a particular Baptist perspective, it's not remotely straightforward about that, either.