We live in a time of exile. At least those of us do who hold to traditional Christian beliefs. The strident rhetoric of scientism has made belief in the supernatural look ridiculous. The Pill, no-fault divorce, and now gay marriage have made traditional sexual ethics look outmoded at best and hateful at worst. The Western public square is no longer a place where Christians feel they belong with any degree of comfort.
For Christians in the United States, this is particularly disorienting. In Europe, Christianity was pushed to the margins over a couple of centuries—the tide of faith retreated “with tremulous cadence slow.” In America, the process seems to be happening much more rapidly.