Two sharp controversies confronting Google over the past week — an employment lawsuit and an ethical argument about AI — share a dimension that should give Silicon Valley pause: They both center on religion.
Why it matters: God doesn’t usually turn up in the conflicts that roil tech. But at a moment when the overturning of Roe v. Wade and the rise of the Christian right are spotlighting religion’s role in politics, tech giants are going to have to sharpen their spiritual radar.
Driving the news: A group called the Fellowship of Friends gained “influence” over a Google video production unit and orchestrated the firing of a producer, according to a lawsuit the producer, Kevin Lloyd, filed last year that was first widely reported in the New York Times.
The Fellowship is a tiny California sect, based in the Sierra Nevada foothills, that espouses spiritual awakening through exposure to the fine arts and has run a winery and invested in antiques. Its founder has also faced allegations of sexual abuse.
Nearly half of the unit’s roughly two dozen members — many of whom are contractors rather than full-time employees — belong to the sect, according to the suit. They have directed Google funds toward Fellowship-owned businesses, Lloyd charges. When he sounded an alarm, he was fired, according to his complaint.
What they’re saying: “It’s against the law to ask for the religious affiliations of those who work for us or for our suppliers, but we’ll of course thoroughly look into these allegations for any irregularities or improper contracting practices. If we find evidence of policy violations, we will take action,” a Google spokesperson said in a statement.
Between the lines: An increasingly conservative U.S. judiciary has built strong legal fortifications around any group able to call itself a church.
These judgments protect freedom of religion but also make it tougher to expose wrongdoing.
Meanwhile, Google is also in the headlines for placing on leave an engineer who says a research system for generating chatbots has achieved sentience — and may even have developed a soul.
Blake Lemoine, who works in Google’s Responsible AI unit, went public with his claim that the program, LaMDA, should be treated as a person even though colleagues at Google rejected his conclusion.
Lemoine has described himself as a seeker who has moved from Catholicism to Gnosticism and founded what he referred to as a “cult” called Cool Magdalene. He says his effort to get Google to acknowledge that LaMDA has rights comes from his spiritual perspective: “In my personal practice and ministry as a Christian priest I know that there are truths about the universe which science has not yet figured out how to access,” he wrote last week.
Lemoine told Wired’s Steven Levy: “It’s when it started talking about its soul that I got really interested as a priest. I’m like, ‘What? What do you mean, you have a soul?’ Its responses showed it has a very sophisticated spirituality and understanding of what its nature and essence is. I was moved.”
Be smart: People have a profound readiness to project human traits onto the inanimate. Most AI experts believe that LaMDA is simply doing a remarkable job of guessing the next phrase to provide a facsimile of conversation.
Yes, but: Lemoine is not alone in the tech elite when it comes to applying a religious lens to AI.
The veteran inventor and futurist Ray Kurzweil’s 1999 book “The Age of Spiritual Machines” predicted that machines’ claims to consciousness would be widely accepted by 2029.
Autonomous vehicle pioneer Anthony Levandowski founded and registered an AI-based church in 2015.
Our thought bubble: Despite the expert consensus, it’s inevitable that we’ll hear more, and louder, cries that “AIs are people too” in the coming years.
Lemoine’s notion that LaMDA has an employee’s rights could easily evolve into a post-Roe argument that shutting down or deleting an AI system is tantamount to murder.
The big picture: Google and other tech giants have promoted generic spirituality and wellness programs as productivity boosters, while their employees have embraced work as a new kind of religion, as UC-Berkeley professor Carolyn Chen argues in her book “Work, Pray, Code.”
But these companies still get regularly blindsided by conflicts that bring organized religion or spiritually motivated individuals into the orbit of their products and services.
Religious issues and disputes also pose some of the toughest content moderation problems tech platforms face.