In our last episode, we heard from Lionel Windsor, who talked about his new book, Truth Be Told: Living truthfully in a post-truth world. In this episode, we’re going to focus in on one chapter in Lionel’s book, looking at how, as Christians, we can live truthful, godly lives in a world that is becoming more and more technologically complex.
Technology is not all bad; you listen to this podcast through a variety of technologies. But it’s hard to know how to live in such a technologically complex world. Lionel helps us think through some of the core issues that are vital for us to grasp in a world like ours.
Links referred to:
- Truth Be Told (Lionel Windsor)
- Podcast episode 112: Telling the truth with Lionel Windsor
- Our next event: Embrace AI and lose your soul? How to think about AI as a Christian with Akos Balogh (13 Mar). Watch a short video in which Akos and Akos’ AI-generated avatar talk about the event:
Runtime: 25:58 min.
Transcript
Please note: This transcript has been edited for readability.
Introduction
Peter Orr: In a recent episode, we heard from Lionel Windsor talking about his new book, Truth Be Told: Living truthfully in a post-truth world. In this episode, we’re going to focus in on one chapter in that book, looking at how, as Christians, we can live truthful, godly lives in a world that is becoming more and more technologically complex.
Technology is not all bad; you’re listening to this podcast through a variety of technologies. But it’s hard to know how to live in such a complex world—such a technologically complex world. Lionel’s going to help us think through some of the core issues that are vital for us to grasp in a world like ours.
I hope you enjoy the episode.
[Music]
PO: Hello and welcome to the Centre for Christian Living podcast! I’m Peter Orr and I’m joined again by Lionel Windsor. This is the second conversation we’ve had following his book, Truth Be Told. In this podcast episode, we’re going to focus in on one chapter, where Lionel addresses the issue of technology.
A summary of Lionel’s book
PO: But I’ll start Lionel by asking: can you give us an overview of the book as a whole—particularly for those who might not have listened to the previous episode?
Lionel Windsor: Yeah. Thanks, Pete! It’s great to be here.
The book is called Truth Be Told: Living truthfully in a post-truth world. A good way to think about it is the book is in three parts. In the first part, I’m looking at a whole lot of issues that have to do with truth in our world—what it means to live in a post-truth world. It’s issues to do with—well, I start with politics, but I don’t stay in politics. [Laughter] I talk about technology, culture, and a little bit of the history of Western thought. I talk about our own institutions. Then I talk about our personal lives and relationships. I go through that and I see that we actually have a major truth problem in our world.
Why am I doing that? It’s not just because I want to depress everybody, [Laughter] but because I want to show that the gospel of the Lord Jesus Christ gives us a reason to believe in the truth and a reason to be truthful, as well as a reason to be faithful, in the face of major truth problems that our world and even our own lives have. I look at various parts of the Bible. I look at God, who is truthful and faithful. I look at that in the Old Testament, John’s Gospel, 1 John, Ephesians, 1 Timothy, 2 Corinthians—various places that talk a lot about truth.
Then in the last part of the book, I look at, “Well, what does it mean for us to respond to the truth?” By believing in the truth, by trusting in the truthful God, by repenting and admitting our own sin, and by actually living truthful lives. I’ve got a lot of ideas and thoughts that come from the Bible, rather than just my own head, about living truthful lives and living faithful lives.
That’s what the book is about: it’s a mixture of cultural critique, Bible and practical Christian living all in one package.
Technology and truth
PO: Just before we get into the specifics of technology, how does technology fit into the book?
LW: Yeah. Technology is the second main chapter in my book. One of the big issues in our world when it comes to this post-truth world (that is, a world where people don’t even care about truth) is the issue of how technology amplifies and gives us problems when it comes to being truthful—being truthful with one another and having access to the truth. Technology is a problem. It’s also a great thing, but it’s also a problem. I made it my second chapter because in the first chapter, I just talk about the obvious, which is post-truth politics, and then I go, “Well, actually, the problem is bigger than that.” Technology is one of the facets of our lives where truth is in trouble—especially when it comes to social media and the way in which we interact with one another, the way in which we receive information, the way in which we give information. I want to help people to understand that. Later on in my book, I refer back to the chapter on technology and give some practical ideas about how we can better use our technology, or refrain from using it, for the sake of truthfulness.
Being an Asimov versus a Shelley
PO: Okay, I’m going to ask you a strange question and you’re going to have to explain the question as well as answer it. [Laughter] Asimov or Shelley?
LW: Okay! [Laughter] Yes. You’re asking me that question because that’s how I start in my chapter.
Asimov: I grew up as an electrical engineer. No, I didn’t grow up as an electrical engineer; that’s the wrong way to think about it. I almost feel like I just ended up that way. I grew up as someone who really liked technology.
PO: You were born to be an electrical engineer!
LW: I was born to be an electrical engineer! I really, really liked technology. [Laughter] I loved reading the science fiction of Isaac Asimov. If you know Isaac Asimov—some people do; some people don’t—Isaac Asimov loved robots. He had lots of robots, and the robots were faithful servants of humanity. There was a very, very positive view of technology. They had these three laws built into them—the Laws of Robotics. Have you heard of them? You shall not harm a human being, was the first one. Then you obey the orders of human beings, as long as it doesn’t conflict with the first law. Basically, robots were designed to be faithful servants of humanity. That was a wonderful vision of technology.
I went to Uni and I became an electrical engineer. I worked in technology—solar energy. I loved computers and still do. I wasn’t an early adopter of social media, but I was close to the earlier edge of that. “I like technology”: that’s Asimov. That’s his vision of technology: wonderful, faithful servant of humanity. Technology’s great. Social media: wonderful, because it will help us to connect.
Shelley: who’s Shelley? Shelley is the author of Frankenstein; or, The Modern Prometheus. Frankenstein has a monster. We always need to make sure we get this right: Frankenstein is not the monster, but is the doctor who designed the monster. Frankenstein is a Romantic Gothic novel, which was probably technically the first science fiction novel. It was basically a really negative view of technology: Frankenstein creates this monster. He creates a human being—a robot kind of thing—and this monster ends up destroying him and everybody. It’s a bleak view of technology that’s full of fear.
That’s kind of also true of technology. As I’ve gone through life and as I’ve looked at social media, it’s become a bit more Frankenstein’s monster to me [Laughter] than Isaac Asimov’s robots, because there are these major problems in the way in which the technology does things to us and changes us.
There are these two visions of technology. Both of them are kind of true, but we have to not be naïve and think, “Oh yeah, technology is wonderful.” At the same time, you can’t be a complete naysayer and say, “Technology is always bad”, because there are all sorts of technologies that we’re using. Right now, we’re recording this using technology and your listeners are listening to it using technology. They’ve downloaded it using the internet and probably found it on social media. I’m glad for that. But we also need to be very careful about technology as well.
The pros and cons of technology
PO: Obviously technology has always been there from the beginning, but the printing press, in many ways, introduced a step change in the way that we use technology to communicate. Do you want to talk about how that was both for good and for ill as well?
LW: Yeah, yeah. That’s the thing about technology: when we talk about technology—when we say the word “technology”—we’re normally thinking of the latest technology. Right now, technology at this point means phones or something. But the printing press was a very important technology. “Technology” is just anything that humans use to extend themselves and to live in their environment.
The printing press: you can overstate it, but we don’t quite realise some of the changes in our relationship to truth that the printing press brought about—in many ways, for good, but also we need to realise that there were some downsides to it.
I’m very glad for the printing press. It made mass-produced books available. What that meant was more and more people could have access to books. That was actually great—especially because lots of people could then have access to the Bible and have access to the Bible in their own languages. People could then study the Bible and read it in the original languages, and see that, “Ooh, actually, not everything that has been passed down by church tradition is right.” In many ways, the printing press was one of the factors that sparked the Reformation in Europe, where people were actually returning to the sources. People were able to read the Bible for themselves, rather than feel like they needed to rely on a human authority. That was really good! Asimov. [Laughter]
But also, people were reading the Bible for themselves, and that’s a good thing—except they were reading the Bible for themselves. Shouldn’t we be reading the Bible for God—for community? So there was a fracturing: people’s relationship to truth—not just to the Bible, but to truth generally—became more fractured. That was one of the factors that was behind big wars, major conflicts and that sort of thing.
We just need to realise that there’s always good and bad behind technology. Again, I’m very grateful to God for the technology of the printing press. I’m very grateful to God for the technology of the book—the idea of the book that Christians, I think, helped to popularise in the ancient world—the idea of the codex book that you could open up and refer to God’s word. That’s a really good technology! But there are other technologies that aren’t quite so good. I’m thinking more about social media in my book.
Information overload
PO: One aspect of particularly internet-related technology—social media—is information overload. Could you talk a little bit about that?
LW: Hmm, yeah. That’s just one of them. When you think about the challenges social media gives us, one of them is information overload. That is, what’s really good about social media and the internet is that we’ve got all this access to information at our fingertips. But the problem is there’s so much of it, and no one human being is able to easily process that. So how does that affect our relationship to truth?
What it means is that when there’s just so much information, how do you know what’s true and what’s not? I saw something on the internet. I saw it in a TikTok video, or whatever. How do you know it’s actually true? You could say, “Well, you’ve got to research it thoroughly.” How do you research it thoroughly when there’s so much information out there?
Social media companies will come along and say, “We’ve got a solution: we’ll filter it for you. What we’ll do is we’ll set up algorithms that just give you the information that we’ve realised that you want, and we won’t give you the other stuff.” As human beings, we have to do that: we’ve got to filter our information simply because otherwise, we’d just be overloaded by information.
But then, we’re reliant on the algorithms to tell us what the truth is. So when you’re on social media, you get this impression that you’re in the world and that you’re understanding it. People have this weird expression: “Oh, it’s all over Facebook”. It’s not all over Facebook. There’s nothing that’s all over Facebook. It’s all over your Facebook feed, but you think it’s all over Facebook, because that’s the impression Facebook’s giving you—that it’s all over Facebook. It’s not. But you think that’s the world. It’s not; it’s just whatever thing Facebook decided to filter for you.
How are they making that decision? They’re actually not making that decision on the basis of truth; they don’t care about truth. What they care about is profits. They care about what you see only insofar as it keeps you hooked, keeps you going and keeps you coming back for more scrolling and doomscrolling or whatever. That’s what they want because they want to keep you so that they can advertise to you. Their algorithms are automatically working out not what’s true, but what you would like. Then someone in the next room or on the seat next to you on the bus is in a completely different world, being filtered with stuff that they want to read. So is the person at the back of the bus and the other five billion people in the world who are on the same social media platform: they’re all in different universes. We’re all getting different aspects of this truth, but we’re thinking we’re all together. So it’s a very individualising kind of thing.
[Music]
Advertisements
PO: As we take a break from our podcast, I want to tell you about our next event coming up on 13 March 2024 at Moore College. Akos Balogh will be speaking with us about artificial intelligence. AI is obviously being widely embraced across our society. You’ve probably heard of ChatGPT and other AI tools. There’s a lot of concern about how it’s impacting education and other fields. Is it going to get out of control? Is it going to ultimately harm humanity? Should we be alarmed about it?
Akos will help us think as Christians about AI. What does the Bible have to say about how we should think about and use this important technology? How should it, or how might it, affect our faith? We hope that you will join us on 13 March and hear from Akos Balogh, writer and researcher, as he speaks about technology, humanity and theology at this event. Hope to see you there!
And now let’s get back to our program.
Social media and manipulation
PO: I think in the book, you even go as far as talking about social media manipulating us. It’s very easy not to be aware of that and forget that. Do you say a little bit about that?
LW: Yes. Sometimes there’s just the simple algorithmic manipulation, an idea from a book by Johann Hari I was reading, which helped us to see that, basically, social media companies are driven by profits, because they’re companies. They’re using the science of addiction to keep us hooked. That’s a kind of manipulation.
But there’s also the more sinister manipulation, which is the deliberate manipulation—not for the sake of profits, which is bad enough, but for the sake of political ends. It’s quite ironic that we’re talking about my book, Truth Be Told: Living truthfully in a post-truth world. My publishers, to their great credit, are very happy for me to say this: the book was going to be printed in a very, very, very large and populous country in the northern hemisphere to our far north—a country that’s very good at manipulating and controlling information. It was going to be printed there. But through their technology, they set up algorithms to find key words. There are a couple of places—not many—in my book where I talk about this country, just as an illustration. The country picked that up and said, “No, you’re going to have to change this, this, this and this.” This was the government. The printer just wanted to—
PO: For a book that was to be printed in another country and, in one sense, had nothing to do with them.
LW: Yeah, so it was going to be printed in this country, but it was printed for Australia and for us in English. That’s why the book was delayed by three or four months, because we had to find a printer in another country [Laughter] that would actually do it.
I refer to that country, but why am I shy about even mentioning them? [Laughter] You can tell. I’m kind of shy because I’ve actually got it in my head that if I say the word—this five-letter word [Laughter]—then it’s going to be picked up by some algorithm and your podcast might be deleted or something. I don’t know. I don’t think it’s happening; I’m not that far down the conspiracy theory road. But I just don’t want to flag a word [Laughter].
Why am I saying this? This country, and many other countries, are actually deliberately manipulating information.
PO: So it’s not just this country—
LW: Yeah. It’s other countries. It’s not just this country. It’s other ones as well. There are various ways in which social media algorithms are being deliberately manipulated as well, and we don’t quite realise it.
How Lionel uses social media
PO: How does that affect how you use social media, Lionel?
LW: Me personally?
PO: Yeah.
LW: Yeah. I used to use social media a lot. I was more Asimov. [Laughter] What I actually do now is I hardly use it. Even with email: I don’t need to be checking it all day, so I completely silence as many notifications as I can (apart from emergency contacts from people) and I fill my day trying to spend my time with people—the people who are around me, the real people—and with the things that I need to read and follow up, and I write lectures. Then at the end of each day, I quickly go through my various apps: I’ll have a look at Facebook and deal with anything I need to there. I finish in two minutes. I look at X/Twitter for a couple of minutes. I check emails and check all the inboxes and that kind of thing. I do that because—this is not so much to do with the manipulation, but it’s got to do with me knowing how addictive it is, and I know what will happen if I check it during the day. Even with email, I’ll get caught up in thinking about all the various concerns that are coming in, but also the social media, and I can get very anxious about it all. So I just limit it that way.
I’m not saying everyone should do that, and some people have jobs that don’t allow them to do that. That’s fine. But I try to be a bit of a digital minimalist myself, to use a phrase from Cal Newport. It’s not that everyone has to do that.
PO: But I think what you’ve done is been intentional and you’ve thought carefully about how you use social media and technology. As you say, someone else might think carefully and actually do it differently, but you’ve got to be intentional because the technology is so powerful.
LW: It is. It’s really powerful. We don’t realise how powerful it is, because it’s in our pockets, and we can just open it up and it feels like we’re in control. But very easily, we’re not in control.
A book vs social media
PO: Obviously a book is technology. The internet and social media are technology. I know what the difference is. [Laughter] But what’s the difference?
LW: [Laughter] Yes! Marshall McLuhan was a media theorist who said “the medium is the message”. That is, when we deal with a particular technology, we’re actually being trained to think about truth in a certain way. When you read a book, you’re being trained to think about truth in a certain way—that is, truth is about various complex ideas that all work together from beginning to end. There’s normally either a story, an argument or something that works from beginning to end—something you can flip back and forth. You can pause it, have a think about it and go, “Am I going to take that on board? Am I not?” You can flip back and forth and check things out quite quickly—well, not completely quickly. You’re also dealing with truth as a complex but coherent whole, where there are various ideas that all work together. That’s how a book works. That’s how you end up thinking about truth: if you read books, you end up thinking, “Oh, there’s this side, and then there’s this side,” and those things work together in a rich and complex whole.
Social media trains you to think about truth in a very different way. Social media goes straight for your gut [Laughter]: you flick through a whole series of completely disconnected things, and it’s about emotion, emotion, emotion. “That’s funny. He’s sad. I hate her. Oh, that’s awful! Funny bunny rabbit.” You’re just going through and you’re having lots and lots of emotions, and it hits you right in the gut without going through your brain necessarily. When I say, “through your brain”, of course it’s going through your brain, but it’s not going through your rational processing. It’s fast. So it trains you to think about truth in terms of a big, disconnected mosaic of incoherent things. That’s what truth is about. It trains you to think about truth in terms of gut reactions.
You might think, “Oh, that’s not so bad, is it?” But then when you think about how it trains you to think about your personal relationships and the way that you conduct conversation, it’s really, really interesting. When you’re teaching students—students who come in and who are more used to social media—they’re good at finding information quickly, but not necessarily very good at processing or being discerning about it. We’ve got to do some remedial “Here’s how you read a book” stuff at college. They’re better than us in lots of things, but how to read a book—how to grasp the fullness of God’s revelation in all of its complexity, from Old Testament to New Testament—Father, Son and Spirit—and how that works together with sin, justification and sanctification—all those different things, which are all complementary things that all work together as truth in a complex whole—that is probably better communicated through a book than through a medium that is just hitting you with lots and lots of little, short things.
Artificial intelligence
PO: In the book, you touch on AI, but you don’t spend a lot of time on it. I’d love to hear your thoughts on artificial intelligence [Laughter] and the latest iteration of ChatGPT and things like that.
LW: Yeah. This is funny: there was probably going to be about 14 months between submitting the book and it being on the shelves. I submitted the book just a week before ChatGPT came on the scene. I had mentioned AI, so I was glad I’d mentioned it, but I think it was just one paragraph or whatever. Same with virtual reality.
AI is the next iteration that we really need to be thinking about. Again, it’s Asimov’s robot and Mary Shelley’s Frankenstein. There are really good things about it: it’s incredibly powerful, and I’ve been experimenting with it, using it to do various things. But at the same time, its relationship to truth is going to be very, very … How do you know whether what you’re reading or seeing has been created by a bot, or has actually come from the person it’s represented as coming from? That’s true of text, but it’s also true of images and video.
It’s really interesting: I was talking to someone who’s not a Christian the other day and they were saying to me, “Well, if God really cared about us, wouldn’t he give us some evidence for Jesus’ resurrection that was just more obvious—like an actual image or video or something that’s immediate that we can see, because seeing is believing?” I was going, “Actually, God’s given us evidence that’s more reliable, and that is historical evidence for the resurrection that you can discern, look at and weigh up.” Seeing is not believing anymore: what you see, you don’t know if it’s been made up. So seeing something on a video, seeing words on a page that have come from the internet—that’s going to be a significant issue for us.
It’s a little bit like nuclear technology: it can do great things, but it can also be used to create nuclear weapons. That’s true of this. It’s no surprise that the modern successors to Mary Shelley’s Frankenstein are AI-based: The Matrix movies, the Terminator movies, Blade Runner and everything else—they’re dystopian ideas based on AI. [Laughter] People have been predicting this for quite some time, but now it’s upon us!
I’m very glad that we trust in the Lord Jesus Christ, who is the way, the truth and the life (John 14:6)—that he’s risen from the dead and that he is true, because we’re entering a world where it’s going to be difficult to know what is true.
Not to mention—and this is a whole other area—issues of pornography and the way that pornography now is so incredibly available to people that it’s actually changing the way we think about relationships, and the way we think about men, women and sexuality. It’s destroying people’s brains, really. That’s already been happening for quite some time.
I’m very negative about this, aren’t I? That’s why it’s only in chapter 2 of my book. [Laughter] I get more negative after that. But as I say, that’s not the only problem. Then I bring in the gospel of the Lord Jesus, which I talked about on the last podcast episode.
PO: The whole book is on truth, and I think you’ve shown us so helpfully how technology can be used wonderfully for good, but I guess it amplifies human sinfulness. So as Christians, it’s very, very important that we think carefully and clearly about the truth. We have the truth: we have God’s word. We have the Lord Jesus.
Conclusion
PO: Thank you very much again, Lionel, for coming on the podcast. And thank you for writing a book that is so timely and important in the world we live in. Thanks Lionel!
LW: You’re welcome, Pete. Great to be here.
[Music]
PO: To benefit from more resources from the Centre for Christian Living, please visit ccl.moore.edu.au, where you’ll find a host of resources, including past podcast episodes, videos from our live events and articles published through the Centre. We’d love for you to subscribe to our podcast and for you to leave us a review so more people can discover our resources.
On our website, we also have an opportunity for you to make a tax deductible donation to support the ongoing work of the Centre.
We always benefit from receiving questions and feedback from our listeners, so if you’d like to get in touch, you can email us at ccl@moore.edu.au.
As always, I would like to thank Moore College for its support of the Centre for Christian Living, and to thank my assistant, Karen Beilharz, for her work in editing and transcribing the episodes. The music for our podcast was generously provided by James West.
[Music]
Bible quotations are also from THE HOLY BIBLE: NEW INTERNATIONAL VERSION®. NIV®. Copyright © 1973, 1978, 1984, 2011 by International Bible Society, www.ibs.org. All rights reserved worldwide.
Photo by Alex Knight on Unsplash