For most of its history, the United States has been Christian in practice as well as in profession.