We hear over and over that the United States of America is a Christian nation. But is it true?
Certainly, the creation of the US was influenced by Christianity, among other things. But was Christianity the biggest factor, or even an essential one?