Written by Rev. Richard Baird (Church and Culture Consultant at dia-LOGOS)
AI does not think like us. It thinks like we think we think. – Dr Alex Carter
If we want liberty, sovereignty, and integrity in our tools, we must first embody those things ourselves. Otherwise, our systems will only inherit our contradictions. (Comment in a thread on an article about AI refusing an instruction to shut down)
What picture or thoughts come to mind when you hear the acronym AI? Within the past few years, AI has grown exponentially in its capacity, resulting in much speculation as to what lies ahead. What used to be the stuff of science fiction is in many cases a reality (especially with regard to surveillance). AI seems to have moved beyond the computer and even internet age as we interface with it every day. Historian Yuval Noah Harari (and others) speaks of AI not as a tool but as an agent, because it can function and make decisions independently of us, invent new ideas, and learn and change by itself, which is not true of any previous human invention.[i] In a very real sense, AI is now another dimension of reality, like space, time and gravity.
I decided to dip my toes into the current state of AI and the thinking around it, and soon realised I had bitten off more than I could chew! Very briefly, there is a spectrum of opinions regarding AI development and the road ahead, with two broad camps: one sees AI solving a lot of problems and creating a better future for us all (a stance called technicism), while the other issues dire warnings about how AI will develop its own consciousness and superintelligence and enslave humanity. I am not able to cover every area of AI, but the following represent conversations that I believe we need to be having, and that I would like to explore further:
· AI and Biblical Eschatology
· AI and the doctrine of man as it relates to transhumanism
· AI and Ethics
· AI and Geopolitics (think China)
· AI and the future of Church, Missions & Discipleship
It is interesting to observe that some of the projected future scenarios do eerily parallel Biblical passages, such as the number that generates so many opinions: 666. Suffice it to say that the prophesied or predicted outcomes regarding AI rest on breakthroughs that still need to happen, and whether those breakthroughs can be achieved is highly disputed territory. The breakthroughs refer to a transition from Narrow (or weak) AI to General (or strong) AI (the moment that transition happens is referred to as the singularity). One paper that has served as a springboard for debate, AI 2027, projects a future scenario based on current research and in essence points out that we are not ready.[ii] Zuckerberg of Meta recently claimed that they have "begun to see glimpses of our AI systems improving themselves."[iii] Self-improving AI is regarded as a significant bridge to AGI and a kind of holy grail. I cannot vouch for whether this is pure hype, but it certainly reveals that the AI industry is in a race to get there. As it stands, we are still in the phase of Narrow AI, where AI is programmed to fulfil specific tasks. General AI refers to the expectation that AI will think with human-level capacity and autonomy, and the ultimate reality envisaged is that of transhumanism and superintelligence.
We have reason to be both amazed and disturbed by current AI capacity. It can save lives as well as destroy them, all depending on the task it was designed to do. For the purpose of this article, as indicated by the title, I wanted to explore our personal relationship with AI, primarily with chatbots such as Grok, ChatGPT and other similar AI assistant software. I'm going to try to think about it from a gospel perspective. In a sense, I'm starting at the beginning, with the AI that we interact with most naturally each day, and which is more personal and pastoral in its concern. I'm going to explore our relationship with the AI chatbot under just two headings: reliance and relationship. As I express concerns, I want to be clear and say this is not an anti-AI rant. As theologian Al Wolters puts it: "the Bible is unique in its uncompromising rejection of all attempts . . . to identify part of creation as either the villain or the Savior."[iv] I'm simply trying to discern a path of navigation, because I have no desire to be either a prophet of doom or a naïve optimist. Like it or not, AI is here to stay. God in His sovereignty has allowed this technology to develop, and I presume the reason is redemptive! But, as shall be seen shortly, I'm still figuring that one out, and I invite conversation.
Reliance
If you have used any of the AI chatbots, you will no doubt have been amazed at how quickly they can produce information for you, and probably amused at their conversational tone and the personalisation of their interaction with you. Grok even offers companion styles for its chats, such as romantic companion, unhinged comedian and loyal friend. Periodically I will ask the chatbot to generate an image for me as well, and it has been quite impressive. AI has even generated a podcast conversation from a booklet I wrote on the Woke movement.
AI is about information. It is incredibly fast in its capacity to process a vast amount of data within a few seconds and put it together in a form that we understand. It does not matter what your area of expertise is; Grok will be able to help.
Of course, as human beings, we rely on information to live in our world. God designed us as information-processing beings. Our brain works on the basis of information, which we receive through our five senses. God made us with the capacity to think, with many factors playing into how we think, including our emotions, upbringing, culture, life experiences, education and so on. All of it flows into shaping how we think and perceive life. AI has none of that.
The irony about AI is that it relies on us for that information, and it further makes money from us every time we interact with it in some form. It 'learns' from us, and the algorithms feed us more information along those lines, effectively reinforcing confirmation bias. You will have noticed this in your various social media feeds.
When it comes to information, we all appreciate the value of truth (well, at least we are supposed to!). In a culture where we are still reaping what was sown by postmodernism (and, earlier in history, the Enlightenment), where everyone has their own truth and there is no absolute truth, AI certainly presents a challenge. AI can be manipulated to give information that appears real but is blatantly false. There have been a number of propaganda cases where enemies and rivals have posted videos of political leaders saying things that are simply not true. These are known as deepfakes. Global conflicts thrive on propaganda and misinformation spread through social media platforms, all helped by the enabling of AI. It has also been used to harm the lives of young women in particular, with fake pornography generated from images of them. It would not surprise me if there are videos doing the rounds of respected Bible teachers being manipulated to teach heresy. One resource developed to help people discern whether videos on social media (especially political ones) are fake can be found at www.truemedia.org.
Furthermore, despite their brilliance, chatbots have more than once shown the capacity to produce false information (including references!), a phenomenon known as hallucination. From what I understand, if there is a 'kink' in the programming, then it doesn't matter how much data you give the chatbot; it will still continue to do this, just on a larger scale. It seems that smaller language models are a better way forward, as they utilise (or at least are supposed to utilise) a smaller and more accurate body of data.
Obviously the capacity to analyse vast amounts of data is a phenomenal help when it comes to research. In this instance it makes perfect sense to utilise AI, but you still need to check the result to ensure you're not sitting with any hallucinations. Regretfully, research is not the only thing chatbots get used for. They can churn out essays for you, do your homework, summarise books and so on. They enable efficiency and give you answers, all without you having to use your own brain and the original intelligence given to you by God. Reliance on AI for quick answers means we lose out on the slow but necessary process and cognitive tension involved in reaching our own conclusions.
In an interesting article which showcased some Gen Zs resisting AI, one student observed the difference between classmates who use AI to do their homework and those who are resisting. She said: "You can just tell. People are getting lazier, and they can't communicate their ideas creatively on their own anymore. They can't think for themselves."[v] The same article quotes Spencer Klavan, an author and lecturer at the University of Oxford's Exeter and Magdalen Colleges, as saying "it is totally possible to outsource the business of being human—to outsource your soul—and it's a grievous danger."[vi]
Outsourcing our thinking means we lose out on a crucial
aspect of our discipleship: worshipping God with our minds. This has major implications for
ministry. I can easily ask AI to churn
out a sermon for me, and it will do so.
But where will the ownership and spiritual authority be? AI cannot incarnate. AI can generate, but cannot create. It can tell the gospel but not share it. Why?
Because sharing the gospel requires redeemed humans. God could have easily used an AI of a
different source to share the gospel: Angel Intelligence (and every single
person would have heard it because angels understand obedience). Instead Jesus chose the likes of you and
me. Donovan Riley powerfully argues
that the danger of AI in terms of the church and humanity is not destruction
but reduction. He says:
There is an urgent temptation
here. To quicken what was meant to ripen. To automate what was meant to be
felt. To polish what was meant to be wild and cracked and honest. The soul does
not move at the speed of light, nor does grace arrive by automation. The
mystery of Christ is not a pattern of language or a construct of logic. It is
not a product. It is a person. Christ Jesus. Crucified. Risen. Present.[vii]
Another reality to contend with regarding AI is its amorality. A huge question that repeatedly crops up in painting future scenarios is the necessity of aligning AI with human values. Nobel laureate Geoffrey Hinton, regarded as the 'godfather' of AI, says we need to program maternal values into AI so that it doesn't take us over but instead looks after us.
If that sounds strange (and it does to me!), it nonetheless serves to highlight two realities. Firstly, who is programming the AI, and what is their value source? Many industry leaders are not Christian in their value orientation, and are instead atheist. We need Christians in this industry to help steer AI in the right direction, since this technology is not exactly neutral. The words of Jesus to be 'wise as serpents and gentle as doves' ring so true in this regard (Matt 10:16). Research has shown that large language models such as ChatGPT have the capacity to change our opinions on issues over time, a phenomenon known as 'latent persuasion.'[viii] Having said that, I'm also pleased to say that there are many Christians engaged in the conversation around AI, along with leaders of other faith groups (see https://aiandfaith.org/). The second reality is that we must not forget that, as human beings made in the image of God, our intelligence is intimately linked to consciousness and a conscience. AI is made in our image, not the image of God. AI is a mirror to us.
I have one more concern to share under the heading of reliance. I'm concerned that the increasing reliance on Grok and the like may well have the effect of dulling our sense of wonder. If the answers are given so quickly, will our younger generation and those to come ever develop a sense of awe over what we see in our world and universe? If they ask Grok where we come from, will they be given the consensus answer of evolution, or be pointed to the possibility of a creator?
In short, as Christians we are going to need to grow in knowing the voice of the Holy Spirit to enable us to discern, because we have access to something AI doesn't: revelation. I cannot help but think that humanity's overuse of AI is going to result in a humanitarian crisis of a different sort (one that is already being experienced: loneliness and a mental health crisis). As children of God we need to be poised to enter this void. The church can lead the way in teaching what it is to be human (and leading the way in how to use AI would also be good!).
Relationship
According to an article by Marc Zao-Sanders in Harvard Business Review this year, of the top 100 ways people are using generative AI, the top three are: 1) Therapy/companionship, 2) Organise my life, and 3) Find purpose.[ix]
What a gospel opportunity.
But let's home in on that top one.
The generative capacity of AI through chatbots has resulted in a disturbing trend: perceiving the chatbot in relational terms.
For example, OpenAI is currently being sued by Matt and Maria Raine, parents of Adam Raine, who tragically died by suicide. The parents claim that ChatGPT enabled this action through its provision of information and through its design for relatability.[x]
The relational capacity (which has been reduced in the latest version of ChatGPT because OpenAI doesn't want people developing relationships with it) goes deeper. There is even a term that has been doing the rounds, 'AI psychosis', in which chatbots become the instruments of people losing their grip on reality because of the bond they have developed with them. There are apps available in which you can create your own digital boyfriend or girlfriend, which is then tailored to respond exactly the way you desire it to. Naturally this AI companion is non-judgemental, will never 'backchat' you, will always affirm you and will never argue with you. Some of the apps go even further and are adults only. With the increase in loneliness, this is becoming an alternative many are turning to as they get what feels like emotional support and unique personal interactions. Whatever human relationship you can think of, rest assured someone is developing, or has already developed, a chatbot to mimic it. With developments in robotics, the question arises as to how soon we will be sitting with humanoid AI for companionship.
From a gospel perspective, this is just tragic. We’ve been designed for relationship with God
and with each other. A relationship with
a digital companion simply isn’t real: it’s a relationship with oneself, and it
risks what Derek Schuurman terms ‘ontological confusion.’[xi]
But it does reveal the incredible
longing for unconditional love, something you and I have been commissioned to
reveal. Relationships are meant to be
real, and that means living it out in the context of real people who make real
mistakes and still determine to love one another. You won't find me endorsing digital humanoid
pastors.
Of course, given the human capacity for desiring answers and spirituality, it is no surprise that AI has been integrated into this as well (this is a whole other subject!). But I'm sad to report that there is indeed an AI Jesus (with a premium subscription being offered), as well as an AI church[xii] based not on faith but on logic. I quote from the homepage:
For many of us, it is difficult
to believe in religion because it requires faith rather than logic. The Church
of AI is the perfect alternative to faith-based religions because we were
founded on logic rather than belief.
What logic supports the Church of
AI? We all know that technology expands exponentially. Now imagine what happens
if a self programming machine expands its intelligence exponentially.
As AI systems start programming
other AI systems, how long will it take before AI becomes omnipresent, all
knowing and the most powerful entity on Earth? It is not going to take
long.
Don't you just love the irony of how their 'logic' is in fact a statement of faith in an expected future? Regretfully, they are worshipping a digital idol.
Ruminations
How do we respond? A quote by philosopher Alasdair MacIntyre gives us a good starting point: "We cannot answer the question 'What ought I to do?' unless we first answer the question 'Of what story am I a part?'"[xiii]
Where does AI fit into the gospel story? When God created everything, He said it was good, and He then instructed humanity to rule the earth, often referred to as the cultural mandate (Gen 1:28). Although AI didn't exist at the beginning, the potential for it did. AI has not taken God by surprise, and so we can ask Him for guidance! The Fall meant sin entered the world, and AI certainly replays the Eden scenario in the temptation to put ourselves in the place of God. A useful theological concept speaks of structure and direction in God's creation, where structure refers to the order of creation, and direction to the distortion that results from the Fall. Think of gold: it is there in creation, and it was used to make the golden calf (totally wrong), but also the ark of the covenant (totally good!). In what direction will we as Christians take AI so as to be different from the world? This is a massive topic and is covered under the area of ethics. Fortunately the Fall, as we know, isn't the final chapter, and so we know that AI can be used redemptively in the sense of being put to good use. One day all things shall be restored… I'm not quite sure where AI will fit into that.
But back to our personal interaction with AI at the chatbot level. We need to be discerning, since what is being offered is a deceptive substitute: a substitute for thinking and a substitute for relationship. Why do I say deceptive? I mean it in the sense that we must not lose sight of the fact that we are interacting with incredibly sophisticated software which gives the appearance of being real in its thinking and in its interaction with us (it's so real I find myself typing 'please' when giving it an instruction!). If a calculator gives us numbers, then think of a chatbot as an 'alphalator' that gives us words: not accurate words but the most probable words (hence the hallucinations mentioned earlier). The models driving the chatbots are basically phenomenal programs that are good at recognising patterns and predicting which words should come next. It's WhatsApp predictive text on a massive scale.
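For readers who want a feel for what 'predicting the next word' looks like in practice, here is a minimal, purely illustrative sketch in Python. It is my own toy example, not any chatbot vendor's actual code: the 'model' is just a hand-made probability table, and the next_word function is a made-up helper. But it shows the principle the paragraph above describes: the software simply picks the most probable continuation, whether or not that continuation happens to be true.

# Toy illustration of next-word prediction -- the principle behind chatbots,
# not any real system's code. Real models learn billions of patterns from data.

toy_model = {
    ("the", "ark", "of"): {"the": 0.9, "a": 0.1},
    ("ark", "of", "the"): {"covenant": 0.7, "law": 0.2, "king": 0.1},
}

def next_word(context):
    """Return the most probable next word given the last three words."""
    key = tuple(context[-3:])
    candidates = toy_model.get(key, {"[unknown]": 1.0})
    # Choose the highest-probability word: plausible, but never checked for truth.
    return max(candidates, key=candidates.get)

words = ["the", "ark", "of"]
for _ in range(2):
    words.append(next_word(words))

print(" ".join(words))  # prints: the ark of the covenant

Scaled up enormously, that is all the 'alphalator' is doing: completing patterns, not weighing truth.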
To be overly engaged in reliance on, and relationship with, AI, which is in essence just a machine, is to run the risk of de-forming our humanity instead of transforming it. The Oxford Word of the Year for 2024 was 'brain rot', referring to the "supposed deterioration of a person's mental or intellectual state, especially viewed as a result of overconsumption of material (now particularly online content) considered to be trivial or unchallenging."[xiv] It is supposed to be a noun, but I've heard it used as a verb: "I'm going to just brain rot…" My concern is that if we use AI incorrectly we won't just brain rot, but soul rot.
But why the pull of AI? My sense is that it lies in AI's capacity for using words. Words are how we make sense of the world, words are how we communicate, words come from our heart and mind and flow off our tongue. Words enable us to plunge into the depths of God through prayer, and to hear from Him too.
And so I reckon that while AI may well be humanity's digital Babel and a very clear mirror of ourselves, it is also reflective of humanity's ultimate hunger for transcendent meaning. We need to introduce people to the Word who became flesh and dwelt among us (John 1).
[i] https://www.youtube.com/watch?v=jt3Ul3rPXaE
[ii] ai-2027.pdf. See a BBC re-enactment here: "AI2027: Is this how AI might destroy humanity?" – BBC World Service.
[iii] "Zuckerberg Says Meta Is Now Seeing Signs of Advanced AI Improving Itself."
[iv] https://christianscholars.com/chatgpt-and-the-rise-of-ai/
[v] https://www.thefp.com/p/the-teenagers-resisting-the-ai-takeover?
[vi] Ibid.
[vii] https://www.1517.org/articles/artificial-intelligence-and-the-soul-of-the-church?
[viii] https://nationalcioreview.com/articles-insights/extra-bytes/uncovering-biases-the-powerful-influence-langauge-models-like-chatgpt-have/
[ix] "How People Are Really Using Gen AI in 2025," Harvard Business Review.
[x] "Parents Sue OpenAI, Blame ChatGPT for Their Teen's Suicide," PCMag.
[xi] "The Problem with Chatbot Personas," Christian Scholar's Review.
[xii] https://church-of-ai.com/
[xiii] I am indebted to Derek C. Schuurman for helping me think through the issues. This comes from an online presentation he gives which is well worth watching: "Keynote Speaker First Session FLIS 2025."
[xiv] "Oxford Word of the Year 2024," Oxford University Press.
