Two months after the November 2022 launch, ChatGPT had more than 100 million users, making it the fastest-growing consumer application in history.
Since then, it’s become the latest craze.
Now we have GPT-4, Bard, Bing, Claude, Jasper, and a myriad of other tools. It's still early days for generative AI, but there are huge announcements every day.
AI is not just driving your entertainment recommendations or auto-completing your text messages anymore… there are much more powerful use cases.
Generative AI can produce original novels, compose amazing photo-realistic images, and instantly code websites and apps.
GPT-4 has passed standardized tests, scoring in the 90th percentile on the bar exam, 1300 on the SAT, and a 5 (the top score) on the AP Biology exam. Pastors are using it to compose emails and newsletters, turn sermons into devotions, and instantly create graphics and images.
With the rapid introduction of these tools, companies and entire governments are trying to pump the brakes and give themselves time to consider the ramifications.
As church conference sessions, online summits, white papers, and articles seek to help you unlock these tools for ministry, it would be wise for you to consider the implications and adopt a more thoughtful approach.
Here are some thoughts to guide you.
#1 – Be careful adopting technology without thoughtfully considering the ramifications.
This technology is game-changing.
It’s not like updating your phone to the latest model. It’s more like how the Internet changed computers or social media changed the Internet.
We’re experiencing a massive leap forward. And we are only beginning to grasp the implications.
To use another metaphor, it’s like a powerful new drug that hasn’t been evaluated or approved by the FDA, and while it may promise to heal, nobody really knows if the side effects would make the cure worse than the disease.
The uncertainty around this tool isn’t necessarily a reason to avoid it, but you should proceed cautiously, not blindly accepting change as positive progress.
When you consider the amazing promises touted by AI tools and the speed with which they are working their way into everyday life, and combine that with the uncertainty around how they will truly impact our lives, there is ample reason for caution.
Just a few weeks after GPT-4 was released, more than 1,000 technology leaders raised significant ethical concerns and noted a lack of planning and management. Their open letter called for a six-month moratorium on releasing more powerful tools.
These tools can do a lot of amazing things, but the more transformational a technology looks, the more thoughtful and cautious we should be.
Key Question: Where do we need to exercise caution?
#2 – Anytime you delegate trust, it’s scary.
Pastors and church leaders generally have a set of tried and true resources that have been vetted over time and have proven to be reliable.
When pastors turn to commentaries, resources, or mentors, they do so knowing the scholarship of the work or the character of the authors. And while no extra-Biblical resource is infallible, there is something comforting about a trustworthy and widely accepted source.
When you use tools like ChatGPT to create content, you are delegating a bit of trust to a computer model – a model that very few understand.
With Generative AI, there is still much to understand about how source material becomes outputs and how information is fed to computer models in the first place. There is tremendous concern about bias and discrimination, privacy and data use, intellectual property ownership and copyright, and the potential for misinformation and disinformation.
To be fair, many of these issues exist with other tools, but the depth and power of AI frame these issues in a new light.
One of the biggest issues you may combat as a pastor is your people turning to ChatGPT, Bing, or Bard for spiritual guidance. They may not understand how the models were trained or be able to point to specific sources, but the generated text could become truth to them. These tools are far more sophisticated and personal than a search engine, and the conversational nature of their responses may increasingly become a source of truth for people.
Key question: Where do we need to apply a dose of healthy skepticism?
#3 – There’s not yet a common practice for citing sources, and that’s getting people into trouble.
If you've used ChatGPT, you have likely been impressed with the information and the human-sounding response. That's because the tool isn't just finding an article and serving up the information. It's using all of its training to create a response and delivering it in natural language.
Without getting too technical, it helps to know what the letters in ChatGPT stand for: Generative Pre-trained Transformer. The outputs are generated by instantly assembling a natural-sounding response from all of its training. It is drawing on sources, to be sure, but you may not always know what those sources are.
And because the responses sound much like a human answering a question, it is easy for people to pass off computer-generated work as their own. Copyright laws simply haven’t caught up with these tools.
The line between human effort and AI contributions can quickly blur, complicating the question of authorship and attribution.
If you use AI to write a letter to your congregation, is that letter really from you? If you use AI to write a devotional series based on a recent sermon, are you really the author? What kind of backlash could you face if people felt tricked into thinking you wrote something?
As you think through these issues, you might have lots of other questions. These questions are good, and you should wrestle through them.
- Are you comfortable using ChatGPT for research?
- Are you comfortable using ChatGPT to create sermon or devotional content?
- What sources do you cite and what sources are implied?
- How accepting are you of the content generated by AI tools?
- Are you concerned about how ChatGPT was trained on theological issues?
- Is some kind of peer or pastoral review necessary?
It might be helpful for you and your team to create an internal (and maybe even public-facing) policy to clarify how you will use generative AI. In fact, we’ve created a sample template that could serve as a starting point for your church.
Key question: Does your staff need to think through or communicate how you will use AI-powered tools?