Religion in America has been rocked in recent decades by two societal shifts: the rise of Christian evangelicals as a right-wing political force, and the increasing number of people who decline to affiliate with any faith tradition.
New research presents evidence that these trends, usually discussed separately, are in fact related. It reports that the rate at which people dissociate themselves from religion is higher in states where the Christian right flexes its political muscle.