There is a very bad argument for celibacy which has reared its head throughout the tradition and which is, even today, defended by some. It runs something like this: married life is morally and spiritually suspect; priests, as religious leaders, should be spiritual athletes above reproach; therefore, priests shouldn’t be married.
I love Augustine, but it is hard to deny that this kind of argumentation finds support in some of Augustine’s more unfortunate reflections on sexuality (original sin as a sexually transmitted disease; sex even within marriage is venially sinful; the birth of a baby associated with excretion, etc.). I once ran across a book in which the author presented a version of this justification, appealing to the purity codes in the book of Leviticus. His implication was that any sort of sexual contact, even within marriage, would render a minister at the altar impure. This approach to the question is, in my judgment, not just silly but dangerous, for it rests on assumptions that are repugnant to good Christian metaphysics.
The doctrine of creation ex nihilo necessarily implies the essential integrity of the world and everything in it. Genesis tells us that God found each thing he had made good and that he found the ensemble of creatures very good. Expressing the same idea with typical scholastic understatement, Thomas Aquinas commented that “being” and “good” are convertible terms. Catholic theology, at its best, has always been resolutely anti-Manichaean, anti-gnostic, anti-dualist — and this means that matter, the body, and sexual activity are never, in themselves, to be despised. In his book “A People Adrift,” Peter Steinfels correctly suggests that the post-conciliar reaffirmation of this aspect of the tradition effectively undermined the dualist justification for celibacy that I sketched above.
But there is more to the doctrine of creation than an affirmation of the goodness of the world. To say that the finite realm in its entirety is created is to imply that nothing in the universe is God. All aspects of created reality reflect God, point to God, and bear traces of the divine goodness (just as every detail of a building gives evidence of the mind of the architect), but no creature and no collectivity of creatures is divine (just as no part of a structure is the architect). This essential distinction between God and the world is the ground for the anti-idolatry principle that is reiterated from beginning to end of the Bible: Do not turn something which is less than God into God. Isaiah the prophet put it thus: “As high as the heavens are above the earth, so high are my thoughts above your thoughts and my ways above your ways, says the Lord.” And it is at the heart of the first commandment: “I am the Lord your God; you shall have no other gods besides me.” The Bible thus holds off all forms of pantheism, immanentism, and nature mysticism — all the attempts of human beings to divinize or render ultimate some worldly reality. The doctrine of creation, in a word, involves both a great “yes” and a great “no” to the universe.
I had the privilege of hearing Dr. Billy Graham preach about twenty years ago in Cincinnati. At the time, Dr. Graham was around eighty years old and clearly in frail health. He came to the podium and commenced to speak, but the crowd of young people, stirred up by the Christian rock bands who had performed earlier, was restive and inattentive. Graham paused, folded his hands, and quietly said, “Let us pray.” With that, a stadium of fifty thousand people fell silent. Once a spirit of reverence held sway, the preacher resumed. I remember thinking, “What an old pro!”
That old pro, arguably the greatest Christian evangelist of the past hundred years, died this week at ninety-nine, and it’s difficult to overstate his impact and importance. It is said that he directly addressed 215 million people in 185 countries in the course of his ministry. No other preacher, in the entire history of Christianity, has had such a range. At the height of his powers, he filled arenas and stadiums, for weeks at a time, in some of the most jaded, materialistic, and skeptical cities in the world. And when preachers and other religious celebrities all around him were falling into scandal and corruption, Billy Graham stood tall, a man of integrity. His moral heroism was on particularly clear display in the early years of the civil rights movement. Especially in his native South, it was the unquestioned practice to seat black people in segregated sections of churches and arenas. Though it cost him quite a few of his traditional supporters, Graham insisted that his crusades should be racially integrated. Impressed by this show of courage, Martin Luther King Jr. became a friend and appeared with Graham at a crusade in 1957.
What was it about his preaching that was so compelling? I suppose in his early years, he demonstrated a fair amount of “flash,” prowling the stage, waving his arms, and moving dramatically from whispering to shouting. But as he matured, a fair amount of that theatricality faded away. What remained was a gentle sense of humor (usually self-deprecating), an obvious sincerity, a keen intelligence, and above all, a clarity in regard to the essentials of the Gospel. Practically every Billy Graham sermon had the same basic structure: you have sought happiness in wealth, pleasure, material things, fame, etc., and you’ve never been satisfied; I want to tell you about what will make you happy. At this point, he would speak of Christ crucified and risen from the dead. Now please don’t get me wrong — and don’t write me letters! As a Catholic, I affirm that there is more to salvation than accepting Jesus Christ in faith; there is the full integration into the life of Christ that happens through the instrumentality of the Church and her sacraments. Nevertheless, Catholics and Protestants come together in asserting — as Billy Graham consistently did — that we are sinners who stand in need of Christ’s saving grace. In point of fact, a generous ecumenism was one of the marks of Billy Graham’s approach. It didn’t bother him in the least if someone whose religious journey commenced at one of his crusades continued and came to fulfillment in the Catholic Church.
Much has been made of his relationship with presidents, monarchs, and prime ministers. He did indeed minister personally to twelve US presidents, and the wonderful Netflix series The Crown shows something of the impact he had on Queen Elizabeth II. But I’ve never been particularly taken with this dimension of Graham’s life, which seemed, to me anyway, more sizzle than steak. In fact, one of the low points of his career had to have been his meek acquiescence to Richard Nixon’s anti-Semitic musings, captured on White House tapes. To his credit, Dr. Graham repeatedly apologized for that lapse. He was far more powerful and spiritually efficacious when he prayed over the thousands of ordinary people who had responded to an altar call at the close of a crusade.
It should be a matter of concern to more than Catholics that the Vatican is considering an agreement with Beijing that would give the Chinese Communist Party (CCP) effective control over all Roman Catholics in mainland China and Hong Kong.
Presently, there are two communities in China with the word “Catholic” in their name. One is named the Catholic Patriotic Association (CPA) and is led by “illegitimate” bishops appointed by the Chinese government with neither input nor approval from the Vatican. The CPA toes the line for a government that, among other things, forbids all attempts at evangelization. As Cardinal Joseph Zen put it, “The official bishops are not really preaching the gospel. They are preaching obedience to Communist authority.”
The true Catholic Church in China, in communion with the Holy See and in the line of apostolic succession — things that are crucial to Catholics — is an underground church led by bishops who have been consecrated and sent from the Vatican. It meets in secret and under constant threat of suppression and jail time for its priests.
The state-controlled CPA is the latest of the CCP’s attempts to remove religion as an independent force in China. Since Mao Zedong and the Red Army took over China in 1949, all Christian churches have been suppressed and persecuted. Persecution of the Catholic Church, which at the time of the revolution accounted for three-quarters of China’s Christians, was particularly severe and unrelenting. Under Mao and his Cultural Revolution, the physical violence against the church and torture of priests and religious became even worse. (One could go on for pages with just the highlights of atrocities committed under Mao’s regime.)
Here is a Christmas story I’d prefer not to have to share.
On December 6, the feast of St. Nicholas, the Archdiocese of Washington sat in federal court arguing for an emergency injunction that would require the Washington Metropolitan Area Transportation Authority (WMATA) to run our Christmas ad campaign on the backs of their buses. It’s called “Find the Perfect Gift.” The ad only hints at the faith message the archdiocese is sharing.
The visitor to our website is greeted with a number of options: a list of Christmas Mass times, videos to learn about other faiths’ Christmas traditions, or how to make a family Advent wreath. There are resources to volunteer to help others during the season. There are messages explaining how the season is about the birth of the Christ Child, and what it means for humanity. In other words, it’s what one might expect from any faith-based organization seeking to share a Christmas message.
WMATA turned down the ad, which has no religious imagery, because its advertising guidelines bar religious advertising. But during my discussions with the ad sales folks at WMATA, they indicated that if there were a way we could make the ads more “commercial,” they might be able to run them. By “commercial” they meant selling something — say, if the Catholic Church were selling tickets to a concert or a Mass. But the church doesn’t do that, so there wasn’t really a way to change the ad.
Last week, President Trump announced that the United States would recognize Jerusalem as the capital of Israel and would soon relocate the country’s embassy from Tel Aviv to the Holy City. This news made headlines around the world, both for its reversal of America’s longstanding foreign policy and for its potential to reshape the region’s geopolitics.
As this story continues to unfold in the coming weeks, here are the five things you need to know about President Trump’s announcement and the world’s reaction:
1. Though Israel’s declared capital is Jerusalem, the international community has long refused to recognize this designation.
When the modern state of Israel was being formed in 1947, the United Nations wanted to declare Jerusalem an international zone, given its religious importance to Jews, Christians, and Muslims from around the world. Israel officially declared Jerusalem as its capital, but the city remained divided until the 1967 Six-Day War, when Israel annexed the eastern part of the city. This move was considered illegal under international law. Accordingly, countries from Argentina to Zambia located their embassies in and around Tel Aviv to avoid conflict.
Some people seem to think this is divine retribution for the sins of humanity: Kirk Cameron, former child actor, said in a video on Facebook that Hurricanes Harvey and Irma were “a spectacular display of God’s immense power” and were sent so human beings could repent. Earlier, after seeing the devastation of Hurricane Harvey, conservative Christian pastor John McTernan had noted that “God is systematically destroying America” out of anger over “the homosexual agenda.”
Others disagreed over the reasons for God’s anger, but not necessarily with the assumption that God can be wrathful. Jennifer Lawrence suggested that Irma was “mother nature’s rage and wrath” at America for electing Donald Trump.
It is true that many religious traditions, including Judaism and Christianity, have seen natural disasters as divine punishment. But, as a scholar of religion, I would argue that things aren’t that simple.
The Constitution of India declares the nation a secular republic. Despite this, the nation’s left-leaning Congress Party has long sought to appease conservative religionists. That is, as long as they are members of a minority religious community. This results in a deeply hypocritical approach to governance, in which the Congress Party fails to stand up for liberalism and universal human rights in the face of theocrats. The party may pay lip service to progressive causes, but, as we all know, actions speak louder than words.
This is evident in Congress Party Vice President Rahul Gandhi’s “welcoming” of the Indian Supreme Court’s decision a few weeks ago to ban triple talaq, a practice whereby one can instantly divorce one’s spouse simply by stating “talaq, talaq, talaq,” provided that one meets the government’s requirements of being 1) Muslim and 2) male. Both the party and Rahul’s father, former Prime Minister Rajiv Gandhi, protected the practice in the 1980s in an attempt to appease the party’s regressive base. The major opposition party, the BJP, is no better. The party claims to defend “Hindu” values, but more often than not simply defends the most regressive attitudes imported by the Christian British and Islamic Mughal colonizers.
To illustrate this dynamic, consider the issue of the Victorian-era ban on homosexuality in India. The Congress Party’s conservative Christian and Muslim minority constituents have defended the ban, as have BJP’s conservative Hindu constituents, leaving only smaller liberal minorities in both parties to attempt to enact change. As Sadanand Dhume explained, “Opposition to homosexuality in India may appear to remain relatively broad, but it doesn’t run particularly deep … [unlike in Islam,] antigay positions lack deep scriptural sanction in Hinduism.” As a result, the Congress Party’s secular elite say they want the ban overturned but have done little to enact this change, and the BJP has tried to appear neutral and uninterested in the issue.
However, when it comes to issues that are explicitly unrelated to Hinduism, the BJP often takes the lead where Congress flounders. Despite claiming the mantle of progressivism, the Congress Party used its supermajority in Parliament in 1986 not to ensure the elimination of triple talaq once and for all, but to instead dilute the Indian Supreme Court’s 1985 decision against the policy. The Party did so out of fear of electoral losses, essentially proving BJP’s accusations of appeasement correct.
Editor’s note: At sundown on August 31, Muslims all over the world will celebrate one of their principal festivals, Eid al-Adha. Earlier in June, Muslims celebrated Eid al-Fitr. Ken Chitwood, a Ph.D. candidate studying global Islam, explains the two Islamic festivals.
What is Eid?
Eid literally means a “festival” or “feast” in Arabic. It is celebrated twice a year, as Eid al-Adha (pronounced eed al-Ahd-huh) and Eid al-Fitr.
Why is it celebrated twice a year?
Ehud Sperling couldn’t find what he was looking for in physics classes. He had a half-articulated question about the ultimate nature of reality — the secret reality beneath or behind ordinary reality — but as he listened to lectures about atoms, energy, and the laws of motion, he felt the answer getting further and further away. He switched to psychology. Psych classes didn’t help him with his question either. Then he went to Donald Weiser’s bookstore.
“Weiser’s was the place to find out,” Sperling recalled more than 50 years later. “At that point in time, we’re talking in the late 60s, there was no other place.”
Weiser’s New York store sold occult books. There, you could find tomes on the traditions and technologies of magick. There were books on astrology and astral projection, tarot, the secrets of Egypt, the traditions of Gnosticism, spirit channeling, and the wisdom of the gurus of the East. The sign out front said “esoterica” and “orientalia.”
Donald Weiser died on April 12 at the age of 89. His death was little noted, except for an item in Publishers Weekly and an intimate memorial with friends and family. The truth is, though, that Weiser and his book business changed the religious landscape in America.
The earliest Latin commentary on the Gospels, lost for more than 1,500 years, has been rediscovered and made available in English for the first time. The extraordinary find, a work written by a bishop in northern Italy, Fortunatianus of Aquileia, dates back to the middle of the fourth century.
The biblical text of the manuscript is of particular significance, as it predates the standard Latin version known as the Vulgate and provides new evidence about the earliest form of the Gospels in Latin.
Despite references to this commentary in other ancient works, no copy was known to survive until Dr Lukas Dorfbauer, a researcher from the University of Salzburg, identified Fortunatianus’ text in an anonymous manuscript copied around the year 800 and held in Cologne Cathedral Library. The manuscripts of Cologne Cathedral Library were made available online in 2002.
Scholars had previously been interested in this ninth-century manuscript as the sole witness to a short letter which claimed to be from the Jewish high priest Annas to the Roman philosopher Seneca. They had dismissed the 100-page anonymous Gospel commentary as one of numerous similar works composed in the court of Charlemagne. But when he visited the library in 2012, Dorfbauer, a specialist in such writings, could see that the commentary was much older than the manuscript itself.
I did something for the first time recently — I joined a support group. Two, in fact, on Facebook, run by people suffering from something called trigeminal neuralgia.
TN is a little-known condition that delivers hydrogen-bomb-sized pain to those it afflicts — a pain so difficult to describe that I often resort to admitting, "I wouldn't wish it on Hitler," and mostly mean it.
The "suicide pain," as it's also known, involves the trigeminal nerve in one or more of three branches that cross the upper, middle, and lower parts of the face, on each side. For better and worse it is impervious to opioids, and virtually every convention of pain solution. The culprit appears to be a blood vessel, putting pressure on the nerve at its root. Why is unclear. As a neurologist explained to me, "It could be the result of damage caused by an early virus, or maybe head trauma, but the truth is, we just don't know." In any event, the point here is not etiology, but this: visceral pain of such magnitude is a great equalizer, underscoring the truth that we’re all created equal. It is, to put it mildly, a lesson learned the hard way.
Since joining these groups, I've participated regularly, if not frequently, in the long daily threads, which often consist of members responding to another who is feeling unable to go on. These confessions are also pleas, and the anguish in them is humbling.
The arrest of polygamist leader Lyle Jeffs, evictions of polygamist families and new studies on crippling genetic disorders among small ultra-orthodox or “fundamentalist” Mormon communities in rural Utah have made headlines this summer.
This spotlight on polygamy is likely to make the majority of Mormons who are nonfundamentalist uncomfortable. The Church of Jesus Christ of Latter-day Saints (LDS) – the mainstream Mormon Church with 15 million members worldwide – publicly rejected polygamy in 1890. But to this day, mainstream Mormons encounter stereotypes of polygamy.
As a scholar of Mormonism and gender and a Mormon myself, I know that the truth about Mormonism and polygamy is complicated and confusing. For more than 175 years, polygamy and tensions surrounding it have defined what it means to be a Mormon – especially a Mormon man.
Founded by Joseph Smith in 1830, the Mormon movement from its beginnings offered a unique perspective on the religious role of men.
Over the past several years, the nation has been torn apart by memory wars. The conflict usually centers on a monument that reflects different historical narratives for different groups. The Confederate flag over the South Carolina statehouse and the Robert E. Lee statue in Charlottesville are two high-profile examples. Memory wars are fought when there are conflicting historical narratives that are essential to the identity of a group. While disagreements about race and the legacy of the Civil War will continue to dominate the headlines, other clashes over memory are worth noting. One such clash is the forthcoming Mormon memory war.
Religion is often a key aspect of group identity, which makes it a prime motivating force to generate a memory war. There is currently a revolution taking place concerning how the Church of Jesus Christ of Latter-day Saints (LDS Church) understands the key events of its foundation. For over a century, the problematic details of Mormon history were swept under the rug. As the late apostle Boyd Packer said, “some things that are true are not very useful.” In the era of the internet, the LDS Church no longer has a monopoly over its history. The church has made several significant attempts to be more transparent with its history: the Gospel Topics essays, the Joseph Smith Papers project, and even a new Church history are all part of this new strategy. This opens the door for more competition over traditional Mormon historical narratives.
This is planting the seeds for a Mormon memory war. Mormonism is unique among American religions in the role it plays in a person’s identity. The institution has a significant role in creating and maintaining a community that each member operates in. Since the early days of Mormonism and the accompanying persecution, Mormons have always been a “peculiar people” with distinct barriers between those outside the Mormon community and those within. LDS theology outlines a purpose for adherents, and the institution adds cultural facets that further contribute to one’s identity.
There have been critics of core LDS historical narratives since the foundation of Mormonism. This has never resulted in something that one could call a memory war because the critics were traditionally outside the Mormon community. In addition, there was little to fight over because most of the symbols of the key narratives do not reside in the public space, giving outside groups little authority to engage in any memory war.
“If it were proposed to all nations to choose which seemed best of all customs, each, after examination, would place its own first; so well is each convinced that its own are by far the best.” —Herodotus (III, xxxviii)
To be honest, it took about four readings of Rémi Brague’s essay “From What is Left Over” (First Things, August 2017) before I felt confident enough to tell myself I understood it. Some of the perplexity probably stemmed from the word culture being used over 60 times, and not always with the same meaning.
Unlike Isidore of Seville, I don’t think that etymology holds the key to the cosmos, but once I realized my confusion concerned this word culture, I consulted a dictionary to guide me out of the perplexity. Yet I also abided by C.S. Lewis’s instructions: “One understands a word much better if one has met it alive, in its native habitat. So far as is possible our knowledge should be checked and supplemented, not derived, from the dictionary.” And: “everyone starts telling us what the word does not mean; a sure proof that it is beginning to mean just that.”
The Oxford English Dictionary says culture and custom are both of Latin origin. Each came to Modern English via the Normans around the same time. Since the 1300s, custom has always meant habits, behaviors, manners, practices — or, as our Congress puts it, “ways and means.” (Even menstruation used to be called “the custom of women.”) The Greek equivalent, nomos, can mean law, nature, or custom, as in the line by Herodotus.
Culture as a synonym for custom only came into contemporary practice in about 1860. Nearly fifty years later, Edith Wharton, instead of using the word culture, swam against the current by naming her novel The Custom of the Country (1909).
Among the millions of travelers heading out for the summer holidays, some are choosing an unlikely destination: a rusted bus on the edge of the Alaskan wilderness.
Fairbanks Bus 142 (aka the “magic bus”) is where the 24-year-old Chris McCandless died in 1992. Well-educated and economically secure, McCandless rejected the materialism he saw in contemporary U.S. society. He set out to explore with only what he could carry, and ended up living off the Alaskan land for a few months before dying of starvation. His story was first told by writer and mountaineer Jon Krakauer in the book “Into the Wild,” and later made into a film directed by Sean Penn.
Since then, dozens of people every year have sought to follow in McCandless’ footsteps. Finding inspiration in his mode of self-sufficiency, many head out to Alaska like secular pilgrims seeking to imitate a great saint from long ago, and to live more simply.
“Into the Wild” is not the only film to affect people in such a way. I have found many ways in which films around the world have motivated people to get up and travel to locations previously unknown – what I call “film-induced pilgrimage.” In these travels, tourists begin to look a lot like spiritual seekers.
Pope Francis has created a new category for beatification, the level immediately below sainthood, in the Catholic Church: those who give their lives for others. This is called “oblatio vitae,” the “offer of life” for the well-being of another person.
Martyrs, a special category of saint, also offer up their lives, but they do so for their “Christian faith.” And so, the pope’s decision raises the question: Is the Catholic understanding of sainthood changing?
Most people use the word “saint” to refer to someone who is exceptionally good or “holy.” In the Catholic Church, however, a “saint” has a more specific meaning: someone who has led a life of “heroic virtue.”
This definition includes the four “cardinal” virtues: prudence, temperance, fortitude and justice; as well as the “theological” virtues: faith, hope and charity. A saint displays these qualities in a consistent and exceptional way.
Throughout American history, religion has played a significant role in promoting social reform. From the abolitionist movement of the early 19th century to the civil rights movement of the 20th century, religious leaders have championed progressive political causes.
This legacy is evident today in the group called religious progressives, or the religious left.
The social gospel movement of the late 19th and early 20th centuries, as I have explored in my research, has had a particularly significant impact on the development of the religious left.
What is the social gospel movement and why does it matter today?
A recent letter from the Vatican reminded the world’s Catholic bishops of a rule mandating the use of wheat bread containing gluten for the celebration of the Eucharist, a Christian liturgical service called the Mass by Catholics.
Reactions were immediate. Catholics with celiac disease recounted their experiences in trying to find low-gluten options and even approaching priests before Communion to receive consecrated wine from a separate chalice so there was no chance of cross-contamination. Some described how they had even refrained from receiving Communion and decided instead on a “spiritual Communion.”
As a specialist in liturgical studies, I was not really surprised. Today in North America there is an intense concern about the nature of bread used for Communion by Catholics — celiac disease, an immune reaction triggered by gluten, affects at least 1 percent of the global population.
But while the Catholic Church does allow low-gluten breads, the use of gluten-free recipes is strictly prohibited.
Just this past June, at a national meeting of various Hindu organizations in India, a popular preacher, Sadhvi Saraswati, suggested that those who consumed beef should be publicly hanged. Later, at the same conclave, an animal rights activist, Chetan Sharma, said,
“Cow is also the reason for global warming. When she is slaughtered, something called EPW is released, which is directly responsible for global warming. It’s what is called emotional pain waves.”
These provocative remarks come at a time when vigilante Hindu groups in India are lynching people for eating beef. Such killings have increased since Narendra Modi and his right-wing Bharatiya Janata Party came to power in May 2014. In September 2015, a 50-year-old Muslim man, Mohammad Akhlaq, was lynched by a mob in a village near New Delhi on suspicion that he had consumed beef. Since then, many attacks by cow vigilante groups have followed. Modi’s government has also prohibited the slaughter of buffalo, thus destroying the Muslim-dominated buffalo meat industry and causing widespread economic hardship.
Most people seem to assume that no Hindu has ever consumed beef. But is this true?