In this age of intense political and religious division, many Christian nationalists have convinced themselves that they have a special claim on America. That simply is not true. The United States was not founded as a “Christian nation.” The Constitution enshrines a clear separation of church and state, and no version of Christianity has ever been the official national religion. … In reality, such believers in “White Christianity” are a diminishing minority in American society, even as they aspire to be the dominant force and to silence those who disagree with them by any means necessary. In a fateful attempt to win and hold power, the Christian right forged an alliance with Donald Trump and his neofascist MAGA movement. This has been a transactional relationship, given that Trump transparently violates almost every supposed tenet of Christian faith and doctrine. Through almost any religious lens, he can reasonably be described as an unrepentant sinner.