How Christianity Lost Its Voice
Historians can keep arguing about whether America was founded as a "Christian" country, but one thing is certain: for most of this country's history, Christianity has been the dominant cultural force. From the engine behind social service outreaches, to the founding of our greatest educational institutions and hospitals, to prayers before government sessions and sporting events, the Christian faith has made an indelible mark on our society -- at least until recently.
Today that voice is in decline. Although a 2012 Pew Forum study indicated that 73 percent of Americans still identify as "Christian," that number drops dramatically in significant areas of influence such as the media, academia, business, and entertainment.