Was America Ever a Christian Nation?

A newly released report from the Public Religion Research Institute sheds light on how religious Americans believe the nation has changed in recent years.

As part of a survey that also examined LGBT issues (as my colleague Kelsey Dallas reported), the PRRI study found a growing feeling among evangelical Christians that America is losing its Christian influence.

Read Full Article »

