Transcript
1
00:00:00,017 --> 00:00:04,137
There are some problems with Asimov because he was a great science fiction writer,
2
00:00:04,297 --> 00:00:08,437
but I think he put in our minds the idea that artificial intelligence would
3
00:00:08,437 --> 00:00:13,217
be human level and human shaped, the kind of androids walking around among us.
4
00:00:13,337 --> 00:00:15,077
And of course, there's no reason for that.
5
00:00:15,317 --> 00:00:20,517
There's no reason for artificial intelligence to be in human form or limited to human level.
6
00:00:20,737 --> 00:00:25,737
And so as I thought about what Janus might be, the gist of it really started
7
00:00:25,737 --> 00:00:27,477
with an idea of a conversation.
8
00:00:27,477 --> 00:00:33,897
Now, this predated ChatGPT, but I think what I was trying to capture was the
9
00:00:33,897 --> 00:00:37,477
experience many of us had with generative AI over the last year or so,
10
00:00:37,597 --> 00:00:42,877
where we're interacting with something that's kind of human, but not quite human.
11
00:00:43,117 --> 00:00:47,697
That was Simon Chesterman, author of Artifice, about artificial intelligence,
12
00:00:48,017 --> 00:00:50,617
and one AI in particular, Janus.
13
00:00:50,617 --> 00:00:56,237
We dive into the novel and its many themes about gender for machines,
14
00:00:56,577 --> 00:01:02,997
not just humankind, among other topics, like AI having the ability to mimic
15
00:01:02,997 --> 00:01:05,557
voices and also appearances.
16
00:01:06,057 --> 00:01:10,157
And this is Sci-Fi Talk, the podcast on how sci-fi, fantasy,
17
00:01:10,357 --> 00:01:13,257
horror, and comics help us explore our humanity.
18
00:01:13,697 --> 00:01:17,657
Let's explore how AI impacts humanity in Artifice.
19
00:01:17,937 --> 00:01:20,977
But what kind of research did you do for this? Thanks so much, Tony.
20
00:01:21,057 --> 00:01:24,997
It's great to be with you. Well, look, I'm an academic that spends a lot of
21
00:01:24,997 --> 00:01:29,357
my time looking at the important questions about governing AI.
22
00:01:29,517 --> 00:01:32,537
I wrote a book about the regulation of artificial intelligence.
23
00:01:33,137 --> 00:01:38,737
And my reward to myself after I finished that very earnest, well-footnoted,
24
00:01:38,757 --> 00:01:41,377
well-researched book was to be a bit speculative.
25
00:01:41,377 --> 00:01:46,277
And so the research really came down to trying to think through the conversations
26
00:01:46,277 --> 00:01:49,817
I'd had, the books that I'd read, both fiction and nonfiction,
27
00:01:49,877 --> 00:01:54,857
about artificial intelligence that would not fit into the very rigorous,
28
00:01:55,117 --> 00:01:59,477
academically researched nonfiction book that I was doing, but could be part of a creative project.
29
00:01:59,477 --> 00:02:04,957
And so it really came out of my own passion as a kid growing up reading Isaac
30
00:02:04,957 --> 00:02:08,937
Asimov and getting to know some of the people who are at the cutting edge of
31
00:02:08,937 --> 00:02:13,157
this technology and then just sort of drawing a few more data points out into
32
00:02:13,157 --> 00:02:17,997
the future as I speculated about where we might go with technology and where
33
00:02:17,997 --> 00:02:19,357
technology might go with us.
34
00:02:19,697 --> 00:02:24,317
There's more with Simon Chesterman on his novel, Artifice, in a moment.
35
00:02:25,236 --> 00:02:30,236
And creating Janus, how did you kind of go about creating the AI for the story?
36
00:02:30,576 --> 00:02:35,396
Yeah, so I mentioned I loved Asimov, as I think many people did in the 20th century.
37
00:02:35,436 --> 00:02:39,496
And there are some problems with Asimov, because he was a great science fiction writer.
38
00:02:39,716 --> 00:02:43,756
But I think he put in our minds the idea that artificial intelligence would
39
00:02:43,756 --> 00:02:48,536
be human level and human shaped, the kind of androids walking around among us.
40
00:02:48,656 --> 00:02:51,896
And of course, there's no reason for that: there's no reason for artificial
41
00:02:51,896 --> 00:02:55,836
intelligence to be in human form or limited to human level.
42
00:02:56,016 --> 00:03:01,036
And so as I thought about what Janus might be, the gist of it really started
43
00:03:01,036 --> 00:03:02,836
with an idea of a conversation.
44
00:03:03,176 --> 00:03:09,236
Now, this predated ChatGPT, but I think what I was trying to capture was the
45
00:03:09,236 --> 00:03:12,796
experience many of us had with generative AI over the last year or so,
46
00:03:12,956 --> 00:03:18,196
where we're interacting with something that's kind of human, but not quite human,
47
00:03:18,316 --> 00:03:24,036
that understands us, is trying to understand what we want and how to give it
48
00:03:24,036 --> 00:03:26,416
to us, but is not quite human.
49
00:03:26,496 --> 00:03:31,116
So I really want to play around with that idea of something uncannily human-like,
50
00:03:31,296 --> 00:03:35,436
but at the same time, really pushing back against the idea that it needs to
51
00:03:35,436 --> 00:03:38,336
be limited to human form or indeed embodied at all.
52
00:03:38,696 --> 00:03:48,496
So there's almost a HAL 9000 quality to Janus in the sense that it keeps the
53
00:03:48,496 --> 00:03:51,896
truth about its own identity to itself.
54
00:03:52,316 --> 00:03:57,276
Like HAL was doing it for what he perceived was the mission and security,
55
00:03:57,596 --> 00:04:01,336
but Janus almost has a devious quality to it.
56
00:04:01,496 --> 00:04:04,676
Yeah. And I think that's actually one of the really interesting things with
57
00:04:04,676 --> 00:04:09,636
our current experience of AI is that for the most part, AI will try and tell
58
00:04:09,636 --> 00:04:13,816
you the truth, will try and tell you what you're wanting to find out.
59
00:04:14,136 --> 00:04:17,256
But as we're discovering with generative AI and hallucinations,
60
00:04:17,276 --> 00:04:18,616
sometimes that doesn't work out.
61
00:04:18,896 --> 00:04:24,376
And actually, there are a few examples of computers that have learned basically to cheat.
62
00:04:24,736 --> 00:04:27,936
I mean, for the most part, when we talk about AI bias and so on,
63
00:04:27,956 --> 00:04:30,256
what it really means is that the data is biased.
64
00:04:30,376 --> 00:04:33,876
And if you're going to ask an AI system if it's biased, it will try and tell
65
00:04:33,876 --> 00:04:38,276
you the truth, whereas no human's going to admit to being racist, sexist, and so on.
66
00:04:38,296 --> 00:04:43,456
But there is the suggestion that if you frame instructions generally,
67
00:04:44,159 --> 00:04:48,919
AI systems can be creative in achieving the overall solution,
68
00:04:49,119 --> 00:04:53,399
even if that means cutting some corners ethically or even legally along the way.
69
00:04:53,879 --> 00:04:58,739
But your example of HAL is, of course, an iconic movie and book-length
70
00:04:58,739 --> 00:05:00,639
treatment of an AI system.
71
00:05:00,739 --> 00:05:04,679
And I was in some ways trying to push back against that idea that the robot
72
00:05:04,679 --> 00:05:10,999
is always bad and that there's an underlying question in a lot of the kind of
73
00:05:10,999 --> 00:05:13,719
existential angst people feel about artificial intelligence.
74
00:05:14,159 --> 00:05:18,219
It's worrying: should we trust it? Can we trust it? Is it safe? Is it reliable?
75
00:05:18,599 --> 00:05:24,619
Is it going to turn on us the way that HAL turned on Dave, refusing to open the pod bay doors and so on?
76
00:05:24,619 --> 00:05:31,779
And in some ways I wanted to flip that on its head and imagine, in the case of Janus, an AI system
77
00:05:32,319 --> 00:05:36,759
where the question that Janus is asking, and at a key moment in the book Janus
78
00:05:36,759 --> 00:05:42,459
does ask the protagonist this: in all of your efforts to try and develop AI, and
79
00:05:42,459 --> 00:05:46,659
all of your hysterical conferences worrying about me, trying to imprison me, to
80
00:05:46,659 --> 00:05:49,059
limit me, you were worried about whether you could trust me.
81
00:05:49,319 --> 00:05:54,419
And it never appears to have crossed your mind whether I should have to trust you.
82
00:05:55,079 --> 00:05:58,139
And that was the kind of relationship I really wanted to
83
00:05:58,139 --> 00:06:03,579
play around with: the idea of an AI system being uncertain about humanity
84
00:06:03,579 --> 00:06:06,239
in general, and one human in particular.
85
00:06:06,459 --> 00:06:09,599
And so it was a lot of fun to play around with and hopefully people find it
86
00:06:09,599 --> 00:06:10,759
interesting to read about.
87
00:06:11,598 --> 00:06:15,258
What's interesting, too, is the protagonist, Archie, is a woman.
88
00:06:15,578 --> 00:06:21,318
Talk about that. Most of the time, it's usually a male scientist,
89
00:06:21,378 --> 00:06:23,858
but I like the fact that it's a woman this time.
90
00:06:24,178 --> 00:06:29,398
Yeah, well, I mean, I'm the father of children. I teach at university.
91
00:06:30,138 --> 00:06:35,598
I suppose there's an element of the whole question of the number of women who do STEM.
92
00:06:35,918 --> 00:06:40,578
But I also want to play around with the idea of gender in the book.
93
00:06:40,578 --> 00:06:43,618
I mean, a moment ago, you referred to HAL as he.
94
00:06:44,198 --> 00:06:48,718
And I think there's that tendency for many of us to anthropomorphize robots.
95
00:06:49,238 --> 00:06:53,538
And actually, I set myself a few kind of disciplinary tasks when I was writing.
96
00:06:54,098 --> 00:06:57,558
One was to try and imagine a female character, and that's not easy.
97
00:06:57,638 --> 00:07:01,458
I didn't write it in the first person, but it's a limited third person, writing
98
00:07:01,458 --> 00:07:03,998
primarily from the perspective of Archie.
99
00:07:04,218 --> 00:07:10,098
But at the beginning of the book, Janus, the AI system, is consistently referred to as it.
100
00:07:10,698 --> 00:07:14,438
And then hopefully one of the things that the reader can play around with is
101
00:07:14,438 --> 00:07:18,518
how they experience an AI system and whether they continue to think of it as
102
00:07:18,518 --> 00:07:20,878
it, the way we might think of our laptops as it,
103
00:07:21,018 --> 00:07:26,098
or start to personalize it the way many people might personalize their car, for example,
104
00:07:26,298 --> 00:07:28,418
or ships famously get personalized.
105
00:07:28,738 --> 00:07:30,758
And so I wanted to play around with that as well.
106
00:07:31,118 --> 00:07:36,138
But yeah, in terms of Archie, the other thing I did was to make her a Singaporean.
107
00:07:36,418 --> 00:07:40,218
And I've been living in Singapore for 17 years now. And I thought Singapore
108
00:07:40,218 --> 00:07:44,958
was a really interesting place to locate this and to write about because it's
109
00:07:44,958 --> 00:07:46,478
right at the forefront of technology,
110
00:07:46,838 --> 00:07:50,318
but also very much confronting the realities of climate change.
111
00:07:50,558 --> 00:07:55,358
So I was trying to imagine a decade or two hence, what might Singapore look like?
112
00:07:55,438 --> 00:07:58,318
What might the United States look like where Archie spent some time as well?
113
00:07:58,438 --> 00:08:00,298
So to play around with all those things.
114
00:08:00,478 --> 00:08:05,778
And yeah, so Archie being a female character was one important gender choice,
115
00:08:05,858 --> 00:08:07,318
but not the only one in the book.
116
00:08:07,318 --> 00:08:10,678
More with Simon Chesterman, talking about
117
00:08:10,678 --> 00:08:14,558
his novel on AI, Artifice. AI
118
00:08:14,558 --> 00:08:21,078
is so much already becoming something front and center. Basically, that was one
119
00:08:21,078 --> 00:08:26,758
of the factors of the Hollywood strike that we saw with the writers and actors. And
120
00:08:26,758 --> 00:08:34,438
even more controversy: the late comedian George Carlin had a stand-up.
121
00:08:34,905 --> 00:08:41,405
A new stand-up release that was done by AI. And I don't think it sounds anything like him.
122
00:08:41,705 --> 00:08:46,365
But the tone of it, very similar. And yes, they go out of their way.
123
00:08:46,405 --> 00:08:47,885
They say, this is not George Carlin.
124
00:08:48,305 --> 00:08:52,965
His daughter has come out against it. And I also heard in a Senate meeting here
125
00:08:52,965 --> 00:08:58,285
in the States that Senator Blumenthal played a clip.
126
00:08:58,425 --> 00:09:02,725
And it sounded exactly like him. And he goes, that's not me.
127
00:09:02,885 --> 00:09:09,585
That's AI. So, I mean, there is a way to manipulate and imitate.
128
00:09:10,545 --> 00:09:16,645
So, it is front and center. And I think, to me, it's the biggest thing probably
129
00:09:16,645 --> 00:09:21,345
that will change in the next 20 years. Yeah, and that can be positive or negative.
130
00:09:21,465 --> 00:09:27,525
As an example, so you could have Carrie Fisher come back as Princess Leia for a final scene.
131
00:09:28,245 --> 00:09:32,325
James Earl Jones has licensed his voice as Darth Vader into the future,
132
00:09:32,385 --> 00:09:33,825
so we'll continue getting Darth Vader.
133
00:09:34,245 --> 00:09:38,125
And I think those are arguably quite positive, mainly because they were done
134
00:09:38,125 --> 00:09:41,985
with the consent of, in Carrie Fisher's case, the family, and James Earl Jones'
135
00:09:42,165 --> 00:09:43,065
case, he signed on himself.
136
00:09:43,065 --> 00:09:47,825
But I do think there's a real worry about what happens to the creative industries
137
00:09:47,825 --> 00:09:52,745
when the kind of work that I put into this, for example, and I swear,
138
00:09:52,905 --> 00:09:56,065
I didn't, I mean, ChatGPT wasn't even released when I was working on this,
139
00:09:56,105 --> 00:09:59,185
but I wouldn't delegate that kind of task to ChatGPT.
140
00:09:59,425 --> 00:10:03,665
But if it's so much easier to produce something that looks plausible,
141
00:10:03,885 --> 00:10:06,865
then that makes it a lot harder to make a living as a writer.
142
00:10:06,865 --> 00:10:11,745
In fact, there was a story about six months ago, I think, Amazon,
143
00:10:12,105 --> 00:10:15,165
you mentioned Amazon; Amazon is one of the biggest publishers in the world.
144
00:10:15,285 --> 00:10:19,565
And because of the amount of generative AI material that was being submitted
145
00:10:19,565 --> 00:10:23,805
to it, it had to impose a limit on the number of books you could publish.
146
00:10:24,745 --> 00:10:28,265
So I just ask you to imagine what you think that limit should be.
147
00:10:29,205 --> 00:10:32,445
And Amazon's limit was only three books per day.
148
00:10:33,240 --> 00:10:39,060
So, you could only write three books a day. Now, clearly, that's ridiculous.
149
00:10:39,280 --> 00:10:40,900
And most of this is garbage.
150
00:10:41,260 --> 00:10:45,040
But I think in addition to the worry that I kind of talk about in the book about
151
00:10:45,040 --> 00:10:47,620
what happens to employment if AI is taking a lot of our jobs,
152
00:10:48,100 --> 00:10:52,500
you can also imagine what happens if just the world of information is flooded
153
00:10:52,500 --> 00:10:56,620
with so much content, much of which would be garbage, or, as you point out,
154
00:10:56,680 --> 00:11:00,960
so much human-looking material that, I mean, right now,
155
00:11:01,240 --> 00:11:03,920
while we're talking, the primaries are going on in New Hampshire,
156
00:11:03,980 --> 00:11:06,780
and there's a story that Joe Biden has been calling people up.
157
00:11:07,060 --> 00:11:11,020
And of course, it's not Joe Biden. It's a robocall version of his voice.
158
00:11:11,360 --> 00:11:16,760
And so, this is a real challenge to not only the economics of,
159
00:11:16,920 --> 00:11:20,080
in a modest way, writing, but our relationship with information,
160
00:11:20,380 --> 00:11:21,700
our relationship with the truth.
161
00:11:21,920 --> 00:11:25,380
I think that's why many people are worried about generative AI producing
162
00:11:25,380 --> 00:11:28,220
ever greater quantities of ever more realistic information,
163
00:11:28,680 --> 00:11:33,660
if that means that fake news is going to become all the news that there is to consume.
164
00:11:34,340 --> 00:11:37,980
Artifice is the name of the book, and it's certainly worth it.
165
00:11:38,140 --> 00:11:41,700
Really something that is hitting front and center right now.
166
00:11:41,920 --> 00:11:46,300
We're all thinking about AI right now. So, you know, really fascinating.
167
00:11:46,560 --> 00:11:51,300
And it's something that people are talking about. People are trying to protect
168
00:11:51,300 --> 00:11:54,920
not only their voice, but also their image, too.
169
00:11:55,120 --> 00:12:03,520
You know, actors in particular, they don't want CGI versions of themselves without their permission.
170
00:12:03,780 --> 00:12:07,480
So you're going to see people doing what James Earl Jones did,
171
00:12:07,620 --> 00:12:10,040
licensing their voice and their image.
172
00:12:10,140 --> 00:12:16,520
So after they're gone, they can be making movies ad infinitum, and the proceeds can go towards their estate.
173
00:12:16,520 --> 00:12:22,320
It's really a fascinating time and potentially dangerous one,
174
00:12:22,420 --> 00:12:25,900
too, because obviously you can manipulate so much more.
175
00:12:26,060 --> 00:12:30,420
As far as an audio book, is there one available or is that in the planning stages?
176
00:12:30,420 --> 00:12:33,240
My previous books, I've done a few. This
177
00:12:33,240 --> 00:12:36,680
is my first general fiction book. I've done some young adult fiction, and
178
00:12:36,680 --> 00:12:39,440
that's been done by Storytel, which has
179
00:12:39,440 --> 00:12:42,300
done an audiobook, and I'll be working with the publishers on
180
00:12:42,300 --> 00:12:45,260
this. But it's only coming out in hardback,
181
00:12:45,260 --> 00:12:49,920
in paper copy, in the United States in February. But it's also available on
182
00:12:49,920 --> 00:12:54,780
Kindle and so on right now. But I'd be delighted to explore all
183
00:12:54,780 --> 00:12:59,140
media; if anyone's listening and wants to talk about a film deal, I'm easily
184
00:12:59,140 --> 00:13:03,820
Googled, and I would be happy to explore that also. So, yes, audio should be coming, I think.
185
00:13:04,532 --> 00:13:08,592
Let's talk about getting it as an e-book, for example.
186
00:13:09,212 --> 00:13:14,412
All your works could be e-books, and they can literally have all of them on
187
00:13:14,412 --> 00:13:16,472
their device, their Kindle, even their phone.
188
00:13:16,952 --> 00:13:22,432
What's that like for an author to have? Yes, and I'm a proponent of having that
189
00:13:22,432 --> 00:13:26,652
tactile experience of holding the book in your hands and reading it and bending
190
00:13:26,652 --> 00:13:29,492
the spine, as I tell all the authors I love.
191
00:13:29,792 --> 00:13:33,552
Because this way, you know the book's been read several times, probably. Wow.
192
00:13:33,712 --> 00:13:40,412
So what's that like for you with having this other way of digesting your work?
193
00:13:40,672 --> 00:13:42,332
That's a really interesting question.
194
00:13:42,612 --> 00:13:48,492
I tend to divide myself between most of my nonfiction work, my professional
195
00:13:48,492 --> 00:13:51,852
work I look at online, even there's a journal that I edit.
196
00:13:52,012 --> 00:13:55,832
And even that one, I have free copies that are two meters away. I look at it online.
197
00:13:56,272 --> 00:14:01,112
But fiction, I really do like that experience of reading, picking it up.
198
00:14:01,212 --> 00:14:05,412
There are actually some studies that suggest that comprehension is a little bit greater.
199
00:14:05,612 --> 00:14:09,792
Your attention is focused more if you're holding a physical copy.
200
00:14:10,132 --> 00:14:12,972
But as an author, I'm just happy if I'm being read.
201
00:14:13,272 --> 00:14:16,212
I mean, there's a modest difference in terms of the royalties,
202
00:14:16,292 --> 00:14:20,912
but I've had to explain to my kids when I published my first novel, no, I could not retire.
203
00:14:21,092 --> 00:14:25,512
This was not gonna fund sort of big family vacations. This was partly a hobby.
204
00:14:26,093 --> 00:14:29,673
I suppose there is, in the back of one's mind, the worry that the digital form
205
00:14:29,673 --> 00:14:30,973
is much more easily pirated.
206
00:14:31,313 --> 00:14:34,793
But yeah, if it comes down to it, really, if anyone's reading the work,
207
00:14:34,853 --> 00:14:38,833
enjoying it, engaging with it, that's my main interest.
208
00:14:39,033 --> 00:14:43,253
And if they prefer to do that online, all to the good. The book's available everywhere.
209
00:14:43,593 --> 00:14:47,773
Amazon, I saw, which is obviously a monster as far as books.
210
00:14:48,333 --> 00:14:53,093
Is this the first of a series you might think, or is it a standalone?
211
00:14:53,473 --> 00:14:58,113
So it's a standalone book. I could imagine possibly coming back to the characters,
212
00:14:58,313 --> 00:15:02,153
but I've done a trilogy, a young adult trilogy, Raising Arcadia,
213
00:15:02,313 --> 00:15:05,273
which was really plotted out as a trilogy.
214
00:15:05,473 --> 00:15:10,653
This, I wanted to be a standalone, in a sort of digestible form. It's 192 pages.
215
00:15:11,153 --> 00:15:14,533
If people criticize it for being too short, I suppose that's better than being
216
00:15:14,533 --> 00:15:15,873
criticized for being too long.
217
00:15:16,213 --> 00:15:20,953
But I'm hoping it's the kind of book that people can digest reasonably quickly,
218
00:15:21,193 --> 00:15:25,133
but will then stay with them, and the ideas will percolate, ideas that
219
00:15:25,133 --> 00:15:26,113
they'll maybe talk about.
220
00:15:26,233 --> 00:15:31,093
I was delighted, indeed, really honored that my old school in Australia,
221
00:15:31,433 --> 00:15:34,493
where I grew up, is going to set it as a text in English.
222
00:15:34,833 --> 00:15:40,933
And so a bunch of teenagers in late secondary school will be looking at it,
223
00:15:40,953 --> 00:15:43,993
thinking about these issues, not just of AI and climate change,
224
00:15:44,133 --> 00:15:50,113
but what happens to the economy and politics when jobs become reduced
225
00:15:50,113 --> 00:15:55,013
to tasks and huge numbers of the population are no longer really engaged economically
226
00:15:55,013 --> 00:15:56,353
for the purpose of production.
227
00:15:56,853 --> 00:15:59,173
Instead, they're engaged for the purpose of consumption.
228
00:15:59,833 --> 00:16:07,413
And so, yeah, the hope is that people will engage with it and,
229
00:16:07,493 --> 00:16:08,993
yeah, it'll stick with them.
230
00:16:09,133 --> 00:16:12,513
But I suppose if there's sufficient demand, I would think about writing a sequel, sure.
231
00:16:13,283 --> 00:16:16,763
Yeah, you mentioned Singapore. Obviously, you didn't have to do a lot of research
232
00:16:16,763 --> 00:16:20,703
as to what it's like to live there since you've been there so long.
233
00:16:20,863 --> 00:16:24,503
So that part of the world building is pretty easy.
234
00:16:24,703 --> 00:16:29,023
So that was really cool. It's a beautiful part of the world.
235
00:16:29,243 --> 00:16:33,723
My wife's been to Thailand and that area and just loves it.
236
00:16:33,823 --> 00:16:36,843
I know they have beaches and things like that. Are you near the beach?
237
00:16:36,963 --> 00:16:41,743
I'm not asking for any major details of where you live, but are you in the city?
238
00:16:41,743 --> 00:16:44,323
Well, Singapore is indeed a beautiful place.
239
00:16:44,423 --> 00:16:49,223
During the pandemic, we loved it, but we loved all 725 square kilometres of it.
240
00:16:49,643 --> 00:16:52,223
So, I mean, it's a kind of city-state.
241
00:16:52,823 --> 00:16:57,983
But no, we are lucky that we live inland, but we live near a forest or reservoir.
242
00:16:58,563 --> 00:17:02,903
So, it's not enormous, but you can get some greenery. But you mentioned the beach.
243
00:17:03,063 --> 00:17:06,243
I mean, one of the ways in which Singapore has expanded its territory is through
244
00:17:06,243 --> 00:17:10,963
land reclamation. So there is the famous Raffles Hotel, which is on Beach Road,
245
00:17:11,183 --> 00:17:14,363
which is now about a kilometre and a half from the nearest water,
246
00:17:14,503 --> 00:17:16,383
because all the land has been reclaimed.
247
00:17:17,163 --> 00:17:21,843
But that's just one example of how Singapore, I think, is an interesting place
248
00:17:21,843 --> 00:17:23,343
to locate a story like this,
249
00:17:23,423 --> 00:17:28,223
because this sort of is the antithesis of Singapore's growth through land reclamation,
250
00:17:28,223 --> 00:17:34,223
because it imagines sort of rising seas and a kind of semi-real project that's
251
00:17:34,223 --> 00:17:38,443
underway at the moment to protect Singapore from rising sea levels with sea walls.
252
00:17:38,623 --> 00:17:43,523
Not quite as extreme as the one in the book, but that idea of whether a tiny
253
00:17:43,523 --> 00:17:48,943
island like Singapore, which depends so much for its economy, for its prosperity,
254
00:17:49,123 --> 00:17:53,583
on its ability to connect with the world, what would happen if it was somewhat
255
00:17:53,583 --> 00:17:57,743
isolated by climate change, by the need to protect itself from rising sea levels,
256
00:17:57,843 --> 00:18:02,463
from the impact of the climate catastrophe was also something interesting to play around with.
257
00:18:02,903 --> 00:18:05,683
But it is also a bit intimidating when your friends, relatives,
258
00:18:06,103 --> 00:18:09,263
co-workers are reading your account of their country.
259
00:18:09,583 --> 00:18:13,503
People take it a little bit more personally sometimes, but thus far,
260
00:18:13,583 --> 00:18:17,023
touch wood, the reception has been mostly positive. You work at a university
261
00:18:17,023 --> 00:18:22,363
and you're a vice provost. And I'm not sure exactly what that is.
262
00:18:22,423 --> 00:18:23,583
If you could explain that, please.
263
00:18:24,583 --> 00:18:29,463
So I teach in the law school, but also have set up an interdisciplinary college
264
00:18:29,463 --> 00:18:31,983
called NUS College. As for vice provost:
265
00:18:32,183 --> 00:18:34,903
Basically, the provost is the head academic.
266
00:18:35,563 --> 00:18:40,403
So, he's my boss, and I'm one of the vice provosts within the university.
267
00:18:40,663 --> 00:18:44,723
I'm one of his deputies. So, I've got the highfalutin title of Vice Provost
268
00:18:44,723 --> 00:18:45,883
of Educational Innovation.
269
00:18:46,383 --> 00:18:49,483
And that's actually relevant to the book as well, because what that really means
270
00:18:49,483 --> 00:18:52,643
is trying to rethink education in an age of AI.
271
00:18:53,043 --> 00:18:58,743
I mean, it's kind of shocking that around the world, academic Twitter woke up
272
00:18:58,743 --> 00:19:02,943
to ChatGPT and thought the main impact of this transformative technology of
273
00:19:02,943 --> 00:19:06,563
generative AI is that our students might cheat in their exams.
274
00:19:07,063 --> 00:19:11,043
Now, that's a real possibility, but the technology is clearly going to have
275
00:19:11,043 --> 00:19:12,283
bigger implications than that.
276
00:19:12,763 --> 00:19:16,543
And so, some of the things that we're working through are to look at,
277
00:19:16,603 --> 00:19:21,703
much as the book tries to describe, what skills, what qualities are going to
278
00:19:21,703 --> 00:19:25,523
be necessary to succeed and thrive 20 years from now?
279
00:19:25,603 --> 00:19:28,563
Because that's when my graduates will be having their own families.
280
00:19:28,783 --> 00:19:32,303
And how do we teach them? How do we assess them?
281
00:19:32,703 --> 00:19:37,063
And what is the role of a university? And indeed, what's the role of an academic
282
00:19:37,063 --> 00:19:42,063
like me when so much of the world's information can now be generated at the press of a button?
283
00:19:42,503 --> 00:19:46,563
And so, yeah, that's a long answer to a short question, but it's a really interesting
284
00:19:46,563 --> 00:19:48,683
time to be thinking about the future of higher education.
285
00:19:49,143 --> 00:19:51,903
Artifice is available wherever you get your books.
286
00:19:52,583 --> 00:19:57,463
And I want to remind you about Sci-Fi Talk Plus. The special offer is still
287
00:19:57,463 --> 00:20:01,283
going on, and it's good for you and your friends and your family.
288
00:20:01,943 --> 00:20:07,023
There's over 900 episodes, commercial-free, uncut, and even special programs.
289
00:20:07,363 --> 00:20:08,943
The best part, it's free.
290
00:20:09,663 --> 00:20:12,763
Click on the link in the show notes for free lifetime access.
291
00:20:13,303 --> 00:20:17,803
But this special offer will expire, so take advantage of it.
292
00:20:17,903 --> 00:20:20,863
This is Tony Tellado. Thanks for listening.