Most people know of AI as a tool that will write papers, complete assignments or generate streams of content at the click of a button, like ChatGPT. Debates about the pros and cons of AI have been ongoing for years, and BNG has been covering them as new technologies develop, especially technologies that can do the work ministers do, like writing sermons.
But what about hurtful AI technology?
Over the years, AI technology has been developed that can fabricate a fake image or video of a person doing just about anything. At first, this was used only in movies, often to include a person who died before the making of the film, such as John F. Kennedy’s cameo in Forrest Gump in 1994. These fabrications are called “deep fakes.” Now, this technology is accessible to the public.
And while such technology might allow a person to portray themselves on a dream vacation, with their dream body and their dream romantic partner, the software also can be used on people without their consent. All anyone needs is an image or video of a person, and a deep fake can be made without their permission.
Of course, this has led to extreme violations of privacy, including the creation and distribution of AI-generated pornographic images, especially of women, on the internet without the consent of the person whose likeness is being used.
Not only are these images embarrassing, but it also is very hard to tell whether they are fake or real. This has sent social media users into spirals on multiple occasions, as images of popular women have been used to create pornographic content that spread throughout their fan bases. When the women’s social media pages are flooded with harassing messages related to the content, they are confused, shocked and sent into a frenzy as they worry whether their reputations have been destroyed.
And these women have done nothing.
You do not even have to post a revealing photo for AI technology to produce a pornographic deep fake of you. All it needs is an image of your face, and the technology generates the rest.
In fact, an increasing number of mothers are choosing not to show their children’s faces online, fearing pedophiles might generate and distribute pornography in the likeness of their children.
It goes without saying, but I will say it anyway: This is an extreme violation of privacy and consent.
And it raises a long-overdue question for internet users: How far is too far when we are sharing public images?
Mark Wingfield, executive director and publisher here at BNG, recently covered a story about the laws and regulations governing how the images we use for articles may be properly used and distributed. There are serious consequences if you use someone else’s digital property without asking first. (That’s also why in this article we cannot show you clear examples of deep fakes, because there’s no way to know who owns the copyright to the original image used to create the fake.)
However, casual image-sharing has been one way the internet has thrived since the inception of social media.
We all remember the earliest versions of memes, which were just edited versions of the same few images, sometimes with text pasted on top. Or the “Charlie bit my finger” video that unintentionally took the internet by storm on YouTube more than a decade ago.
When we share images owned by or depicting real people, is it a violation of their privacy to share them online over and over again? Did anyone ask them if it was OK to treat their appearance or likeness as a joke? One might argue that by simply being online, you are putting yourself at risk of these things. However, posting images online does not constitute consent for those images to be distorted and manipulated. And what about photos or videos that are posted of you without your knowledge in the first place?
Public information can be grabbed and used by anyone, but shouldn’t we care about the consequences?
The extremity of deep fakes is waking us up to this reality. And AI pornography draws a sharp line in this debate. Software to digitally undress non-consenting people is a shocking step over the line.
Or it should be. But it’s not.
Because women all over social media are having to come to terms with the fact that this is what it means to be a woman in the age of AI. If your photo ever has been posted online by you or someone else, there may be pornography in your likeness generated without your knowledge or consent. And there is very little we can do to stop it.
We, as a society, have become so desensitized to this kind of violation that we feel powerless against it.
While we may be able to cover our bodies with clothing, restrict our social media pages, say “no” to sexual advances and remain as modest as we want, we still do not have control over who has access to our bodies behind the AI generators. This is no longer a conversation about basic consent, bodily autonomy or sexual exploitation. I’m afraid this is technological warfare against sexual freedom.
What does this mean for the future of the church?
Amid the grand debates about sexuality, this raises new issues to address.
How do ministry leaders care for congregation members who experience sexual harassment via deep fake technology? The church already has a poor track record of responding to the high incidence of physical sexual violence within our congregations. We need to start thinking of ways to address the psychological and spiritual trauma this technology will inevitably cause church members during our lifetimes.
Further, what happens when someone uses AI to generate pornography using the face of a female ministry leader?
The nonconsensual distribution of revealing photos that were consensually taken by or of a person (“revenge porn”), meant to destroy their reputation or hinder their career, has been happening for quite some time. However, this new technology allows pornography to be created in the likeness of anyone, even people who never would allow a nude or revealing photo of themselves to be taken.
Female ministers are now at high risk of sexualization by inappropriate church members or, worse, retaliation from those who think they shouldn’t be doing the work they do.
How does a minister maintain authority and respect among her congregation when deep fake technology challenges her image as a servant of the Lord?
While the church cannot erase this technology or strip a congregation’s presence from social media to prevent deep fake risk, the church can take a stance against it. The church also can be present with nonjudgmental open arms to care for those who are violated by AI-generated pornography.
More than ever before, we cannot be afraid to talk about sexual freedom in church.
Mallory Challis is a master of divinity student at Wake Forest University School of Divinity. She is a graduate of Wingate University and is a former BNG Clemons Fellow.
Related articles:
This article was written using Chat GPT AI | Opinion by Mallory Challis
Reverend Roboto: Artificial intelligence and pastoral care | Analysis by Kristen Thomason