Transcript
1
00:00:00,017 --> 00:00:04,117
This is Sci-Fi Talk, the podcast on how sci-fi, fantasy, horror,
2
00:00:04,177 --> 00:00:06,617
and comics help us explore humanity.
3
00:00:07,097 --> 00:00:12,297
And this is Time Capsule, episode 399, and I'm Tony Tellado.
4
00:00:12,817 --> 00:00:15,837
We start this edition off with a couple of authors.
5
00:00:16,097 --> 00:00:22,577
First is Simon Chesterman, who tackles artificial intelligence in Artifice, with
6
00:00:22,577 --> 00:00:26,477
an AI, Janus, that might not be totally truthful.
7
00:00:26,817 --> 00:00:34,197
I'm an academic who spends a lot of my time looking at the important questions about governing AI.
8
00:00:34,417 --> 00:00:37,337
I wrote a book about the regulation of artificial intelligence.
9
00:00:37,817 --> 00:00:43,557
And my reward to myself after I finished that very earnest, well-footnoted,
10
00:00:43,577 --> 00:00:46,177
well-researched book was to be a bit speculative.
11
00:00:46,577 --> 00:00:51,097
And so the research really came down to trying to think through the conversations
12
00:00:51,097 --> 00:00:54,637
I'd had, the books that I'd read, both fiction and nonfiction,
13
00:00:54,717 --> 00:00:59,697
about artificial intelligence that would not fit into a very rigorous,
14
00:00:59,937 --> 00:01:04,317
academically researched nonfiction book that I was doing, but could be part of a creative project.
15
00:01:04,757 --> 00:01:09,797
And so it really came out of my own passion as a kid growing up reading Isaac
16
00:01:09,797 --> 00:01:14,417
Asimov and getting to know some of the people who were at the cutting edge of this technology,
17
00:01:14,517 --> 00:01:19,037
and then just sort of drawing a few more data points out into the future as
18
00:01:19,037 --> 00:01:24,377
I speculated about where we might go with technology and where technology might go with us. Yes.
19
00:01:24,737 --> 00:01:29,817
And creating Janus, how did you kind of go about creating the AI for the story?
20
00:01:30,691 --> 00:01:35,331
Yeah, so I mentioned I loved Asimov, as I think many people did in the 20th century.
21
00:01:35,411 --> 00:01:39,451
And there are some problems with Asimov, because he was a great science fiction writer.
22
00:01:39,731 --> 00:01:43,711
But I think he put in our minds the idea that artificial intelligence would
23
00:01:43,711 --> 00:01:48,511
be human level and human shaped, the kind of androids walking around among us.
24
00:01:48,631 --> 00:01:50,391
And of course, there's no reason for that.
25
00:01:50,611 --> 00:01:54,991
There's no reason for artificial intelligence to be in human form or limited
26
00:01:54,991 --> 00:01:58,871
to human level. And so, as I thought about what Janus might be,
27
00:01:59,311 --> 00:02:02,811
the gist of it really started with an idea of a conversation.
28
00:02:03,111 --> 00:02:09,171
Now, this predated ChatGPT, but I think what I was trying to capture was the
29
00:02:09,171 --> 00:02:12,751
experience many of us had with generative AI over the last year or so,
30
00:02:12,891 --> 00:02:16,991
where we're interacting with something that's kind of human,
31
00:02:17,051 --> 00:02:22,811
but not quite human, that understands us, is trying to understand what we want
32
00:02:22,811 --> 00:02:26,351
and how to give it to us, but is not quite human.
33
00:02:26,411 --> 00:02:31,091
So I really want to play around with that idea of something uncannily human-like,
34
00:02:31,211 --> 00:02:35,371
but at the same time, really pushing back against the idea that it needs to
35
00:02:35,371 --> 00:02:38,231
be limited to human form or indeed embodied at all.
36
00:02:38,471 --> 00:02:47,951
So there's almost a Hal 9000 quality to Janus in the sense that it keeps the
37
00:02:47,951 --> 00:02:51,371
truth about its own identity to itself.
38
00:02:51,711 --> 00:02:56,691
Like Hal was doing it for what he perceived was the mission and security.
39
00:02:57,231 --> 00:03:00,991
But Janus almost has a devious quality to it.
40
00:03:01,191 --> 00:03:04,531
Yeah. And I think that's actually one of the really interesting things with
41
00:03:04,531 --> 00:03:09,491
our current experience of AI is that for the most part, AI will try and tell
42
00:03:09,491 --> 00:03:13,651
you the truth, will try and tell you what you're wanting to find out.
43
00:03:13,911 --> 00:03:17,111
But as we're discovering with generative AI and hallucinations,
44
00:03:17,131 --> 00:03:18,451
sometimes that doesn't work out.
45
00:03:18,711 --> 00:03:23,811
And actually, there are a few examples of computers that have learned basically to cheat.
46
00:03:24,885 --> 00:03:27,805
I mean, for the most part, when we talk about AI bias and so on,
47
00:03:27,825 --> 00:03:30,105
what it really means is that the data is biased.
48
00:03:30,245 --> 00:03:33,825
And if you're asking the AI system if it's biased, it will try and tell you
49
00:03:33,825 --> 00:03:38,045
the truth, whereas no human is going to admit to being racist, sexist, and so on.
50
00:03:38,245 --> 00:03:43,285
But there is the suggestion that if you frame instructions generally,
51
00:03:43,645 --> 00:03:48,765
AI systems can be creative in achieving the overall solution,
52
00:03:48,965 --> 00:03:53,245
even if that means cutting some corners ethically or even legally along the way.
53
00:03:53,245 --> 00:03:58,565
But your example of Hal is, of course, an iconic movie and book-length
54
00:03:58,565 --> 00:04:00,405
treatment of an AI system.
55
00:04:00,585 --> 00:04:05,725
And I was in some ways trying to push back against that idea that the robot is always bad.
56
00:04:06,225 --> 00:04:11,405
And there's an underlying question. You know, a lot of the kind of existential
57
00:04:11,405 --> 00:04:16,045
angst people feel about artificial intelligence, worrying should we trust it?
58
00:04:16,105 --> 00:04:17,585
Can we trust it? Is it safe?
59
00:04:17,705 --> 00:04:22,245
Is it reliable? Is it going to turn on us the way that Hal turned on Dave refusing
60
00:04:22,245 --> 00:04:24,645
to open the pod bay doors and so on?
61
00:04:25,045 --> 00:04:30,365
And in some ways, I want to flip that around on its head and imagine in the
62
00:04:30,365 --> 00:04:35,405
case of Janus's AI system, where the question that Janus is asking at a key
63
00:04:35,405 --> 00:04:38,665
moment in the book. Janus asks the protagonist this:
64
00:04:38,805 --> 00:04:44,165
in all of your efforts to try and develop AI and all of your hysterical conferences
65
00:04:44,165 --> 00:04:48,125
worrying about me, trying to imprison me, to limit me, you were worried about
66
00:04:48,125 --> 00:04:49,065
whether you could trust me.
67
00:04:49,345 --> 00:04:54,485
And it never appears to have crossed your mind whether I should have to trust you.
68
00:04:55,358 --> 00:04:58,938
And that was the kind of relationship I really want to play around with,
69
00:04:58,998 --> 00:05:04,978
the idea of an AI system really being uncertain about humanity in general and
70
00:05:04,978 --> 00:05:06,178
one human in particular.
71
00:05:06,598 --> 00:05:09,858
And so it was a lot of fun to play around with, and hopefully people find it
72
00:05:09,858 --> 00:05:11,038
interesting to read about.
73
00:05:11,398 --> 00:05:17,198
Next up is Sarah Beth Durst with a fascinating new novel, Lies Among Us.
74
00:05:17,418 --> 00:05:21,798
That looks at lies, of course, but also what is truth.
75
00:05:22,258 --> 00:05:25,338
How did this story kind of find you?
76
00:05:25,558 --> 00:05:30,518
Through the concept, which is actually very different from how other stories find me.
77
00:05:31,058 --> 00:05:36,758
I always start with a small little idea snippet, like a little spark.
78
00:05:37,258 --> 00:05:40,958
There's this myth that ideas come to writers as these big lightning strikes.
79
00:05:41,098 --> 00:05:43,878
And suddenly you know everything about the world and the story and you're inspired
80
00:05:43,878 --> 00:05:48,278
and the muse is there. And yeah, you go off and write. And it rarely works that way.
81
00:05:48,458 --> 00:05:53,098
When it does, it's great. But with this, it was the concept of lies.
82
00:05:53,618 --> 00:05:58,498
That was it. It was all I knew at the beginning. And I actually sat down and made this list.
83
00:05:58,758 --> 00:06:03,878
Ended up being like two or three pages of all the different ways that people lie to one another.
84
00:06:04,178 --> 00:06:09,578
Little lies, big lies, meaningless lies, nice lies, lies made out of kindness.
85
00:06:09,738 --> 00:06:13,478
This whole list of the ways that lies permeate our lives.
86
00:06:13,878 --> 00:06:17,818
And that's where I began. And the story unfolds from there. Yeah.
87
00:06:18,258 --> 00:06:24,018
Well, talk about something that's very timely. We seem to have right now in
88
00:06:24,018 --> 00:06:28,558
America, and maybe some other parts of the world, a problem with the truth.
89
00:06:28,758 --> 00:06:35,398
So, you know, everybody seems to have their own truth, and there's not a
90
00:06:35,398 --> 00:06:37,818
lot of common ground there.
91
00:06:37,978 --> 00:06:42,018
So I think a book like this just touches on that.
92
00:06:42,098 --> 00:06:46,898
When you wrote this, was that something in the back of your mind, or you just...
93
00:06:46,898 --> 00:06:48,958
It was very much the phenomenon.
94
00:06:49,158 --> 00:06:51,298
Really? Oh, cool. Yeah, it was.
95
00:06:52,117 --> 00:06:58,437
As you said, it's pervasive in our society right now, all through politics, all over social media.
96
00:06:58,557 --> 00:07:02,877
It's a huge, huge issue that just invades all our lives.
97
00:07:03,297 --> 00:07:07,837
And I really, I wanted in this book to address it from a really personal level.
98
00:07:08,117 --> 00:07:11,037
But I think that's one of the things that fiction allows us to do is to really
99
00:07:11,037 --> 00:07:18,957
explore the human condition and get at universal truth through a very personal story.
100
00:07:19,897 --> 00:07:23,037
That's why I tried to start. All right.
101
00:07:23,057 --> 00:07:29,297
So I'm actually also looking at your website, sarahbethdurst.com.
102
00:07:29,637 --> 00:07:34,937
And I'm going to read a little bit of the description. I want to get your comments here.
103
00:07:35,457 --> 00:07:38,877
And it starts off with, I'm going to paraphrase it, actually.
104
00:07:38,877 --> 00:07:41,917
Hannah is the main character. Her mother
105
00:07:41,917 --> 00:07:46,357
dies, and apparently the
106
00:07:46,357 --> 00:07:49,457
funny thing is, as far as Hannah's
107
00:07:49,457 --> 00:07:57,837
concerned, nobody actually sees her or listens to her. And, you know, she's confused,
108
00:07:57,837 --> 00:08:02,937
she's dealing with the grief of her mother's death, and she, you know, is trying now
109
00:08:02,937 --> 00:08:06,477
to kind of find her way in life after these circumstances.
110
00:08:07,577 --> 00:08:15,097
And then her older sister, Leah, doesn't seem to acknowledge her.
111
00:08:15,597 --> 00:08:19,817
And it makes me wonder, is she really there?
112
00:08:19,957 --> 00:08:22,917
So kind of comment on that.
113
00:08:23,057 --> 00:08:28,157
And again, it's lies and truth and maybe a little bit of a Philip K.
114
00:08:28,197 --> 00:08:31,117
Dick-like existence, too. Mm-hmm. Yeah.
115
00:08:32,123 --> 00:08:36,643
The book is about a woman who doesn't exist. Ever. No one can see her.
116
00:08:36,783 --> 00:08:39,803
No one can hear her. No one can touch her. She can't touch anyone.
117
00:08:39,983 --> 00:08:42,043
She can't participate in a conversation.
118
00:08:42,623 --> 00:08:48,603
She does not exist. And it's seen through her eyes, alternating with life through
119
00:08:48,603 --> 00:08:53,723
the eyes of her sister, who very much does exist and has never seen or heard
120
00:08:53,723 --> 00:08:55,223
or spoken with this sister.
121
00:08:55,703 --> 00:08:59,263
And then she's part of this world. The way
122
00:08:59,263 --> 00:09:02,183
Hannah sees the world is, she sees it
123
00:09:02,183 --> 00:09:05,663
almost in layers. She sees the intentions
124
00:09:05,663 --> 00:09:08,903
of other people, the dreams. Like the house
125
00:09:08,903 --> 00:09:11,723
that they both grew up in: Hannah sees it
126
00:09:11,723 --> 00:09:15,123
as this beautiful two-story home with a pristine kitchen
127
00:09:15,123 --> 00:09:18,323
and a garden out in front. And her sister
128
00:09:18,323 --> 00:09:21,683
Leah sees it as a dilapidated one-story house
129
00:09:21,683 --> 00:09:24,543
that has never been painted. There's junk in
130
00:09:24,543 --> 00:09:28,023
the front yard, nobody's picked up a newspaper in forever, and
131
00:09:28,023 --> 00:09:31,183
it smells like mold from the kitchen. And they
132
00:09:31,183 --> 00:09:34,163
just, they experience the world through
133
00:09:34,163 --> 00:09:40,343
two different lenses that interlock. So yes, you know, I was saying before
134
00:09:40,343 --> 00:09:45,003
how fiction allows us to explore the human condition. Well, one of the
135
00:09:45,003 --> 00:09:49,923
things is, I consider this book club fiction with a speculative edge, by which I
136
00:09:49,923 --> 00:09:52,183
mean it's about a concept,
137
00:09:52,343 --> 00:09:54,923
but I use speculative fiction.
138
00:09:55,463 --> 00:10:02,443
It's such an amazing tool for exploring the extremes of a concept.
139
00:10:02,663 --> 00:10:08,343
You can do that with science fiction and fantasy in a way that you really can't in other genres.
140
00:10:08,423 --> 00:10:14,283
It lets you push it to such an extreme to really see what does this mean?
141
00:10:14,343 --> 00:10:20,863
What does this mean when I push a character, a personality through this gauntlet
142
00:10:20,863 --> 00:10:26,303
to the absolute limits of what this concept means and what does it do to them?
143
00:10:26,343 --> 00:10:28,563
How do they survive that? What does it say about us?
144
00:10:29,003 --> 00:10:33,943
Maybe it's pretty common for all of us to hit a wall after lunch.
145
00:10:34,303 --> 00:10:37,263
And I've talked about it here on this podcast before.
146
00:10:37,723 --> 00:10:43,343
So I was looking for the right energy-boosting drink, and I found one.
147
00:10:43,583 --> 00:10:48,123
I eliminated the heavy caffeinated drinks. They're just not right for me.
148
00:10:48,223 --> 00:10:51,023
But I found an alternative in Magic Mind.
149
00:10:51,223 --> 00:10:55,183
It gives me the boost that I so desperately need to complete my day,
150
00:10:55,183 --> 00:10:57,623
not only for work, but also for playing.
151
00:10:58,428 --> 00:11:02,948
I could go over the key ingredients, but frankly, I would just totally ruin the pronunciation.
152
00:11:03,408 --> 00:11:08,788
So I'm not going to go there. But what I do see is what those ingredients do.
153
00:11:08,968 --> 00:11:13,468
And they reduce stress and anxiety, with way less caffeine.
154
00:11:13,748 --> 00:11:19,748
It improves my attention span, memory, and the ability to process and learn new information.
155
00:11:20,048 --> 00:11:23,288
And to boot, it also strengthens my immune system.
156
00:11:23,608 --> 00:11:29,468
God, we all need all those things. Now, if you go to magicmind.com forward slash
157
00:11:29,468 --> 00:11:35,408
Jan, J-A-N in caps and small letters, Tony T,
158
00:11:35,668 --> 00:11:38,948
you can get a month free when you subscribe for three months.
159
00:11:39,268 --> 00:11:47,108
Use the code TONYT20, in caps, to get that free month. That's TONYT20.
160
00:11:47,488 --> 00:11:53,968
It's an extra 20% off, which gets you to 75% off. And this only lasts until
161
00:11:53,968 --> 00:11:58,048
the end of January, so hurry up before it's gone.
162
00:11:58,308 --> 00:12:01,508
An easy way is just to click on the link in the show notes.
163
00:12:01,788 --> 00:12:07,588
It works for me, and it gives me the boost I so desperately need every single day.
164
00:12:08,068 --> 00:12:14,948
At the end of last season of For All Mankind, I had a chance to speak to Edi Gathegi, who plays Dev.
165
00:12:15,608 --> 00:12:20,648
When you first read this character, Dev, what jumped out at you?
166
00:12:20,648 --> 00:12:25,028
Black billionaire. I said, I
167
00:12:25,028 --> 00:12:27,848
gotta do that. I gotta have a chance
168
00:12:27,848 --> 00:12:35,268
right now to play a Black billionaire that might predate Robert Johnson, who
169
00:12:35,268 --> 00:12:40,448
was technically the first Black billionaire. And I felt like it would be a really
170
00:12:40,448 --> 00:12:47,508
great opportunity to have that kind of character reflected in my community, you know.
171
00:12:48,508 --> 00:12:53,508
The interesting thing about this show is, and the thing that they do really
172
00:12:53,508 --> 00:12:56,748
well, is that they make every character controversial.
173
00:12:57,668 --> 00:13:03,008
So what ends up happening is that you're playing a character that's deeply complex
174
00:13:03,008 --> 00:13:08,228
and deeply flawed, which is deeply human.
175
00:13:09,088 --> 00:13:15,188
So in many ways, the rollercoaster that Dev goes on through his arc in the season
176
00:13:15,368 --> 00:13:21,268
is a similar rollercoaster that I went on just in wrapping my brain around playing it.
177
00:13:21,988 --> 00:13:25,988
Fascinating character. Won't give anything away as to the context,
178
00:13:26,188 --> 00:13:31,488
but what's it like working with Shantel VanSanten? You guys had some great scenes
179
00:13:31,488 --> 00:13:34,448
and you can almost feel electricity in the room.
180
00:13:35,365 --> 00:13:42,405
She's wonderful. She's a true artist. She cares deeply about her work.
181
00:13:42,625 --> 00:13:48,405
She's the kind of artist that writes a Bible for her character, but keeps it alive.
182
00:13:48,785 --> 00:13:54,405
Goes back every night and writes ideas and puts pictures on the wall in her
183
00:13:54,405 --> 00:13:58,585
trailer and lives and breathes and sleeps the life of this person.
184
00:13:59,265 --> 00:14:03,565
And it doesn't get in the way of her generosity and spirit on the set.
185
00:14:03,565 --> 00:14:04,985
You know, she's there to collaborate.
186
00:14:05,245 --> 00:14:09,125
She wants to have conversations on her days off and grab lunch and chitchat
187
00:14:09,125 --> 00:14:12,025
about moments. And it was a joy.
188
00:14:12,165 --> 00:14:17,065
She's become a very close friend of mine because of how big her heart is for the work.
189
00:14:18,745 --> 00:14:23,405
There's a lot of technical talk for your character. Did you kind of research,
190
00:14:23,705 --> 00:14:28,625
you know, space exploration a little bit before you took the role?
191
00:14:29,245 --> 00:14:34,325
You know, it's funny. One of my first TV shows was House and,
192
00:14:34,445 --> 00:14:37,805
you know, I damn near wanted to put myself through medical school just to play
193
00:14:37,805 --> 00:14:42,085
that part with all the medical jargon, you know, bought all the books and talked to doctors.
194
00:14:42,765 --> 00:14:46,725
And I learned from one of the actors who I won't name, you know,
195
00:14:46,725 --> 00:14:51,785
if you just say that word, like, you know what it means, the audience is going
196
00:14:51,785 --> 00:14:53,065
to believe that you know what it means.
197
00:14:53,725 --> 00:14:57,745
So you don't have to work that hard, Eddie. I didn't really take their notes
198
00:14:57,745 --> 00:15:00,665
because I was still young and excited to do the work.
199
00:15:01,165 --> 00:15:05,165
But I guess, you know, all those years later on For All Mankind,
200
00:15:05,265 --> 00:15:08,885
I'm not rushing to join NASA just to play this part.
201
00:15:09,125 --> 00:15:13,405
You know, I'm not trying to start up a tech company so I can learn how to mine
202
00:15:13,405 --> 00:15:16,645
for, you know, new resources.
203
00:15:17,905 --> 00:15:22,165
So the answer, the long answer to your really short question is no.
204
00:15:25,220 --> 00:15:30,000
I spoke to Jody Davis, who is the NASA Deputy Payload Systems Engineer.
205
00:15:30,540 --> 00:15:34,580
She's worked on missions such as Mars Phoenix and Mars Science Laboratory.
206
00:15:35,200 --> 00:15:41,260
And here is part of our conversation. I mean, the Mars rovers have been absolutely incredible.
207
00:15:41,460 --> 00:15:46,460
With what they've already done, we know so much more about Mars because of them.
208
00:15:46,620 --> 00:15:48,740
It's just, and then Cassini too.
209
00:15:48,960 --> 00:15:54,140
I mean, it's just, it's really, we've done some amazing
210
00:15:54,140 --> 00:15:59,000
exploring without man. But it certainly would pave the way for a manned flight
211
00:15:59,000 --> 00:16:01,240
someday. So, I'm always hopeful.
212
00:16:02,140 --> 00:16:07,740
Same here. And we tend to call those robotic missions, usually precursors,
213
00:16:07,740 --> 00:16:11,140
you know, precursors to those crewed missions.
214
00:16:11,640 --> 00:16:16,860
There's just things that you need to tease out and learn before you would send
215
00:16:16,860 --> 00:16:19,580
a human there. You're right.
216
00:16:20,820 --> 00:16:25,920
That's amazing. Amazing. Yeah, I mean, you know, obviously, one of the things
217
00:16:25,920 --> 00:16:34,640
is the quest for life and finding life out there, as I subscribe to in the movie Contact,
218
00:16:34,940 --> 00:16:38,980
if we're the only ones, that's an awful waste of space. Mm hmm.
219
00:16:39,600 --> 00:16:42,960
So what are your feelings on it? Oh, you think so?
220
00:16:43,600 --> 00:16:48,520
I agree with you, man. What a waste of space it would be.
221
00:16:48,520 --> 00:16:56,240
And I think it was Stephen Hawking who said that about,
222
00:16:56,320 --> 00:16:58,120
you know, life in our universe.
223
00:16:58,620 --> 00:17:04,040
It's either, you know, we are totally and utterly alone and Earth was a freak
224
00:17:04,040 --> 00:17:06,640
occurrence or it's absolutely everywhere.
225
00:17:07,240 --> 00:17:12,020
And I actually would like to, whether Stephen said that or not,
226
00:17:12,180 --> 00:17:14,140
I actually would like to believe the latter.
227
00:17:14,140 --> 00:17:20,600
And I think we need to change our perception of what life should look like,
228
00:17:20,680 --> 00:17:25,140
you know, whether it's deep sea creatures under the icy,
229
00:17:25,320 --> 00:17:30,000
you know, in the icy waters under the ice shells of Europa or,
230
00:17:30,180 --> 00:17:37,300
you know, microbial life like we continue to search for and look for on Mars.
231
00:17:37,300 --> 00:17:41,700
It just depends on, I think, how you define life.
232
00:17:42,100 --> 00:17:43,960
But I'm hopeful.
233
00:17:45,160 --> 00:17:50,560
Yeah. What can you tell us about Cassini and what that experience was like for you?
234
00:17:51,340 --> 00:17:58,080
Yeah. So Cassini-Huygens was a joint mission between NASA and ESA,
235
00:17:58,200 --> 00:17:59,720
or the European Space Agency.
236
00:18:00,000 --> 00:18:04,580
Yeah. And the Huygens probe was the ESA contribution.
237
00:18:05,500 --> 00:18:10,520
And that's actually what I worked on when I was a grad student. Yeah.
238
00:18:10,880 --> 00:18:17,300
At NASA Langley Research Center. My thesis was on entry, descent, and landing at
239
00:18:17,300 --> 00:18:19,360
Titan, the largest moon of Saturn.
240
00:18:20,089 --> 00:18:24,529
And Titan is very interesting because it's a moon and it has an atmosphere and
241
00:18:24,529 --> 00:18:26,409
there's lots of methane. It's very cold.
242
00:18:26,809 --> 00:18:32,169
Some scientists believe that, you know, it's at that primordial state that Earth
243
00:18:32,169 --> 00:18:34,729
was in after Earth was formed.
244
00:18:34,829 --> 00:18:42,009
And there's a lot of interesting science at Titan. So, I actually worked my
245
00:18:42,009 --> 00:18:48,389
master's thesis on an airship, something that would fly around on Titan.
246
00:18:48,769 --> 00:18:53,509
And at that same time, we were getting ready to, in the Cassini mission,
247
00:18:53,709 --> 00:18:57,449
we were getting ready to release the Huygens probe into the Titan atmosphere.
248
00:18:58,069 --> 00:19:02,369
And actually, ESA called NASA and said, hey, you know, we want to
249
00:19:02,429 --> 00:19:06,249
make sure we characterize the radiative environment, you know,
250
00:19:06,249 --> 00:19:10,329
the radiative environment for heating, for example, on the heat shield.
251
00:19:10,549 --> 00:19:12,949
You know, can you guys take a look, an independent look at this?
252
00:19:13,449 --> 00:19:19,469
So it was a three-month turnaround. And we analyzed the Cassini entry descent
253
00:19:19,469 --> 00:19:23,909
and landing, Huygens probe entry descent and landing from top to bottom.
254
00:19:24,249 --> 00:19:29,569
And I was, as a grad student, part of that team. And so we ended up,
255
00:19:29,569 --> 00:19:34,909
you know, helping ESA give the thumbs up, thumbs down for release of the Huygens probe.
256
00:19:35,249 --> 00:19:42,849
And when you talk about, you know, entry, descent, and landing on Earth, it's very, very quick.
257
00:19:43,029 --> 00:19:46,629
The atmosphere is much, much, much thinner on Earth than it is at Titan.
258
00:19:47,249 --> 00:19:50,929
The Titan descent is on the order of hours.
259
00:19:51,209 --> 00:19:57,469
It was almost like two and a half hours. And that is Time Capsule, episode 399.
260
00:19:57,609 --> 00:20:00,069
Ooh, the magic 400 is just around the corner.
261
00:20:00,389 --> 00:20:03,069
Thanks for listening. This is Tony Tellado.