In the 21st century, the idea that the United States was founded as a Christian nation is alive and well. One might even say that this view of the American past is thriving. Those who believe that America is a Christian nation are serious about their faith in God and country. They have an earnest desire to influence the nation for Christ and to celebrate the freedoms we enjoy as citizens of the United States. They see the study of history as one way of promoting this belief.