"Kyoto Kodaiji temple zen garden" by arakias (via depositphotos)

Visitors to the 400-year-old Kodaiji Temple in Kyoto, Japan can now listen to a sermon from an unusual priest: Mindar, a robot designed to resemble Kannon, the Buddhist deity of mercy. In a country in which religious affiliation is on the decline, the hope is that this million-dollar robot will do some work toward reinvigorating the faith.

For some, the robot represents a new way of engaging with religion. Technology is now a regular part of life, so integrating it into a faith tradition is a way of modernizing religious practice while retaining and respecting its historical elements. Adherents may feel increasingly alienated from conventional, ancient ways of conveying religious messages. But perhaps it is the way that the message is being presented, and not the message itself, that is in need of reform. Robotic priests offer an intriguing solution to this problem.

One unique and potentially useful feature of the robot is that it will never die. It is currently not a machine that can learn, but its creators are hopeful that it can be tailored to become one. If this happens, the robot can share with its ministry all of the knowledge that comes with its varied interactions with the faithful over the course of many years. This is a knowledge base that no mortal priest could hope to obtain.

Mindar is unusual but not unique among priests. There is also a robotic Hindu priest programmed to perform the aarti ritual. In the Christian tradition there is the German Protestant BlessU-2, a much less humanoid robot programmed to commemorate the passing of 500 years since Martin Luther wrote his Ninety-Five Theses by delivering 10,000 blessings to visiting faithful. For Catholics, there is SanTO, a robotic priest designed to provide spiritual comfort to disadvantaged populations such as the elderly or infirm, who may not be able to make it to church regularly, if at all.

To many, the notion of a robotic priest seems at best like a category mistake and at worst like an abomination. For instance, many religious people believe in the existence of a soul, and following a religious path is often perceived as a way of saving that soul. A robot that does not have an immortal soul is not well suited to offer guidance to beings that possess such souls.

Still others may think of the whole thing as a parlor trick—a science fiction recasting of medieval phenomena like fraudulent relics or the selling of indulgences. It is faith, love of God, or a commitment to living a particular kind of life that should bring a person to a place of worship, not the promise of blessings from a robot.

To still others, the practice may seem sacrilegious. Seeking the religious counsel of a robot and venerating the wisdom of an entity constructed by human beings may be impious in the same way that worshiping an idol is impious.

Others may argue that robotic ministry misses something fundamental about the value of priesthood. Historically, priests have been persons. As persons, they share certain traits in common with their parishioners. They are mortal and they recognize their own mortality. They take themselves to be free and they experience the anguish that comes with the weight of that freedom. They struggle to be the best versions of themselves, tempted regularly by the thrills in life that might divert them from that path. Persons are often subject to weakness of will: they find themselves doing what they know is against their own long-term interests. Robots don't have these experiences.

Priests that are persons can experience awe in response to the beauty and magnitude of the universe and can also experience the existential dread that sometimes comes along with being a mortal, embodied being in a universe that sometimes feels incomprehensibly cold and unfair. For many, religion brings with it the promise of hope. Priests are the messengers of that hope, and they are effective because they deliver the message as participants in the human condition.

Relatedly, one might think that a priest is a special form of caregiver. In order to give effective care, the caregiver must be capable of experiencing empathy. Even if robots are programmed to perform tasks that satisfy the needs of parishioners, this assistance wouldn’t be conducted in an empathetic way, and the action wouldn’t be motivated by a genuine attitude of care for the parishioner.

One might think that human priests are in a good position to give sound advice. Though that may (in some cases) be true, there is no reason to think that robots can’t also give good advice if they are programmed with the right kind of advice to give. What’s more, they may be uncompromised by the cognitive bias and human frailty of a typical priest. As a result, they may be less likely to guide someone astray.

Of course, as is often the case in conversations about robotics and artificial intelligence, there are some metaphysical questions lingering behind the scenes that may challenge our initial response to the appropriateness of robotic priests. One argument against priests like Mindar may be that the actions that Mindar performs are, in some way, inauthentic because they come about, not as the result of the free choices that Mindar has made, but instead as a result of Mindar’s programming. If we think this is a significant problem for Mindar and that this consideration precludes Mindar from being a priest, we’ll have to do some careful reflection on the human condition. To what degree are human beings similarly programmed? Physical things are subject to causal laws and it seems that those causal laws, taken together with facts about the universe, necessitate what those physical things will do. Human beings are also physical things. Are our actions causally determined? If so, are the actions of a human priest really any more authentic than the actions of a robotic one? Even if facts about our physical nature are not enough to render our actions inauthentic, human beings are also strongly socially conditioned. Does this social conditioning count as programming?

In the end, these considerations may ultimately point to a single worry: technology like this threatens to further alienate us from ourselves, our situation, and our fellow human beings. For many, the ability to respond to vital human interests like love, care, sex, death, hope, suffering, empathy, and compassion must come from genuine, imperfect, spontaneous human interaction with another struggling in a similar predicament. Whatever response we receive may prove far less important than the comfort that comes from knowing we are heard.

Rachel is an Assistant Professor of Philosophy at Utah State University. Her research interests include the nature of personhood and the self, animal minds and animal ethics, environmental ethics, and ethics and technology. She is the co-host of the pop culture and philosophy podcast I Think Therefore I Fan.