NASHVILLE (BP) – As society continues to wrestle with the complicated issues surrounding artificial intelligence such as ChatGPT, Christians should be ready to engage as the ones holding the ultimate answers, one ethicist told Baptist Press.
Jason Thacker is the chair of research in technology ethics and director of the research institute at the Southern Baptist Convention’s Ethics & Religious Liberty Commission.
He said many of the questions raised by ChatGPT, a form of artificial intelligence designed to provide information to user prompts or questions in a conversational manner, are being asked by both unbelievers and Christians. Yet, the answers often come down to the Christian’s doctrine of the image of God.
“We’re seeing issues in society, we’re seeing fundamental questions being raised and we’re wondering what do we do about it,” Thacker said.
“Our society is also asking those questions and coming to grapple with these fundamental issues. When we legislate or put in regulations, someone’s morality or understanding of how we are to live is being put into policy and put into practice.
“One of the reasons I advocate for Christians to be involved in these conversations is because they’re happening with or without us. I do think that the Christian worldview gives us a distinct picture and understanding of not only who God is and how He calls us to live, but even more specifically on what it means to be human.”
ChatGPT has been making headlines since it was launched in November 2022 by the artificial intelligence company OpenAI.
Thacker explained it is merely one example of generative AI, or artificial intelligence designed to produce or create data or content. The produced content can come in audio, text, visual or other forms.
Despite the benefits ChatGPT has already provided society, just as many questions have been raised about the morality and proper use of the new technology.
Questions range from whether students should use ChatGPT to complete homework assignments to whether ChatGPT should be used at all.
These issues came to the forefront in a recent congressional hearing where Sam Altman, CEO of OpenAI, addressed Congress about the potential drawbacks and weaknesses of the company’s newly launched technology.
Issues discussed included the technology’s potential to eliminate jobs, as well as Altman’s proposal for a national or global agency to oversee the technology on a large scale and prevent dangerous developments.
Thacker said though conversations about potential government regulations and the moral responsibility of technology companies are important, Christians must wisely engage this new area of society on their own.
“This isn’t something that we can just look to the government to say you have to fix this or regulate this,” Thacker said. “We do need the government involved … as well as personal engagement not only from the church itself, but from society at large and from individuals. We also have a role here.
“This affects all of us. This isn’t something that’s just going to be taken care of by simple government regulation and we kind of move on. No, this is the new world we inhabit, and we need to cultivate a biblical understanding of technology and a biblical vision of how we are to steward these technologies well. All of us need to be involved with this.”
Regarding advice for Christians in handling these issues, Thacker once again appealed to the importance of a strong theological foundation.
“There are many competing visions about what it means to be human,” Thacker said. “Generative AI is causing us to ask questions in light of new things that we hadn’t previously considered.
“It’s absolutely crucial that we have a robust understanding of what it means to be human. They (AI technologies) have some attributes of humans and mimic the attributes of humanity, but they are not human because they are not made in the image of God. God has set us apart from creation and we are spiritual beings.
“Recovering a sense of what it means to be human gives us a deep sense of hope and confidence.”