
In this episode of EdUp Learning and Development, host Holly Owens interviews Fred Thompson, founder of Thirst.io. They discuss the intersection of technology, innovation, and learning, exploring the challenges faced by L&D professionals, the importance of personalization in learning, and how AI is transforming the educational landscape. Fred shares insights on building effective teams, the future of learning, and offers valuable advice for aspiring entrepreneurs.
Resources mentioned in this episode:
Guest Contact Information:
___________________________________
Episode Sponsor: iSpring Solutions
🎙️ Huge thanks to our friends at iSpring Solutions for sponsoring this episode of the EdUp L&D podcast! 🙌
If you haven’t already, be sure to check out the iSpring Learning Exchange Community — a vibrant space for creators, educators, and L&D pros to connect and grow.
Grateful for the support and excited to see what our community creates 💡
Thanks for tuning in! 🎧
00:00:00,040 --> 00:00:03,880
Hi everyone, and welcome back to
another episode of EdUp Learning
2
00:00:03,880 --> 00:00:06,920
and Development.
I'm your host, Holly Owens, and
3
00:00:06,920 --> 00:00:10,240
today we're diving into the
intersection of technology,
4
00:00:10,640 --> 00:00:15,480
innovation, and learning with a
truly inspirational guest, Fred
5
00:00:15,480 --> 00:00:18,240
Thompson, founder of Thirst.io.
6
00:00:18,920 --> 00:00:22,040
Fred's journey from software
development to the world of
7
00:00:22,040 --> 00:00:26,080
learning and development is not
only fascinating, it's packed
8
00:00:26,080 --> 00:00:30,000
with powerful lessons.
In this episode, we explore the
9
00:00:30,000 --> 00:00:34,640
real challenges L&D
professionals face today, why
10
00:00:34,640 --> 00:00:39,720
personalization matters more
than ever, and how AI is rapidly
11
00:00:40,120 --> 00:00:42,000
reshaping the educational
landscape.
12
00:00:42,960 --> 00:00:46,520
Fred also gives us a behind the
scenes look at how Thirst is
13
00:00:46,520 --> 00:00:48,680
tackling these challenges head
on.
14
00:00:49,120 --> 00:00:52,760
And if you're an aspiring
entrepreneur,
15
00:00:52,880 --> 00:00:56,000
you'll definitely want to tune
into the part where he talks
16
00:00:56,000 --> 00:01:00,440
about building teams, trusting
talent, and leading with vision.
17
00:01:00,920 --> 00:01:05,319
So whether you're in L&D, ed tech,
or just curious about the future
18
00:01:05,319 --> 00:01:08,080
of learning, this conversation
is for you.
19
00:01:08,560 --> 00:01:13,520
Let's get into it.
Hi, we're iSpring, an
20
00:01:13,600 --> 00:01:17,280
international team of e-learning
enthusiasts who help more than
21
00:01:17,280 --> 00:01:21,720
60,000 clients across the globe
succeed with better online
22
00:01:21,720 --> 00:01:24,640
learning.
Our two flagship solutions are
23
00:01:24,640 --> 00:01:27,960
iSpring Suite and iSpring Learn
LMS.
24
00:01:28,680 --> 00:01:32,240
iSpring Suite is an intuitive,
all in one authoring tool for
25
00:01:32,240 --> 00:01:36,760
creating engaging e-learning
content, and iSpring Learn is an
26
00:01:36,760 --> 00:01:39,520
innovative online training
platform for onboarding,
27
00:01:39,720 --> 00:01:42,440
upskilling, and certifying your
teams.
28
00:01:42,680 --> 00:01:46,160
We also provide tons of free
resources for aspiring and
29
00:01:46,160 --> 00:01:49,080
experienced e-learning
professionals, conduct weekly
30
00:01:49,080 --> 00:01:53,000
webinars with top industry
experts, and organize annual
31
00:01:53,000 --> 00:01:56,640
e-learning conferences,
challenges, and championships.
32
00:01:57,400 --> 00:02:00,640
We'd be happy to get to know you
and pick a solution that fits
33
00:02:00,640 --> 00:02:05,520
your needs best.
Go to www.ispringsolutions.com
34
00:02:05,760 --> 00:02:09,800
to learn more about us, download
our resources and connect.
35
00:02:11,480 --> 00:02:15,760
Hello everyone, and welcome to
another fabulous episode of Ed
36
00:02:15,760 --> 00:02:19,640
Up Learning and Development.
My name is Holly Owens and I'm
37
00:02:19,640 --> 00:02:23,400
your host and I'm super excited
today because I have Fred
38
00:02:23,400 --> 00:02:25,400
Thompson here.
I'm going to let him talk about
39
00:02:25,400 --> 00:02:28,240
himself.
So, Fred, welcome on into the
40
00:02:28,240 --> 00:02:30,560
show.
Yeah, thanks for having me here,
41
00:02:30,560 --> 00:02:32,000
Holly.
Really excited about today.
42
00:02:32,680 --> 00:02:35,600
It's always strange when you're
sort of prompted to talk
43
00:02:35,600 --> 00:02:37,560
about yourself a little bit.
You never know quite where to
44
00:02:37,560 --> 00:02:39,360
start.
Yeah.
45
00:02:39,480 --> 00:02:41,880
So tell us your story.
How did you get into like
46
00:02:41,880 --> 00:02:43,440
thirst, all the different things
that you're doing?
47
00:02:43,440 --> 00:02:47,400
Give us the low down.
Yeah, I mean, it's always
48
00:02:47,400 --> 00:02:48,760
strange.
You look back as well and think,
49
00:02:48,760 --> 00:02:51,640
how do you get here?
And you're never quite sure. My
50
00:02:51,640 --> 00:02:53,800
background and history sort of
go back, you
51
00:02:53,800 --> 00:02:55,760
know, a couple of decades, which
is worrying, isn't it, when we
52
00:02:55,760 --> 00:02:57,680
think about it.
Yeah, I know, when we
53
00:02:57,680 --> 00:03:01,520
think about it out loud,
software development has always
54
00:03:01,520 --> 00:03:03,000
been my thing.
The background is, I'm a
55
00:03:03,000 --> 00:03:05,200
programmer by nature and
by trade.
56
00:03:05,200 --> 00:03:08,120
But ultimately that's where I
started building out and sort of
57
00:03:08,120 --> 00:03:12,440
fell into L&D a little bit in
terms of building out Macromedia
58
00:03:12,440 --> 00:03:16,000
Flash courses and things, all
the way back in the day.
59
00:03:16,000 --> 00:03:19,560
This was before Adobe sort
of bought it and then building
60
00:03:19,560 --> 00:03:21,600
into learning management
systems, etcetera.
61
00:03:22,320 --> 00:03:23,920
And then just really branching
from there.
62
00:03:23,920 --> 00:03:27,320
And over time, as we've
progressed, it's been a case of,
63
00:03:27,320 --> 00:03:31,000
well, I'm not as good at the
programming as I once was, or I
64
00:03:31,320 --> 00:03:35,760
like to think I was, so I started
bringing in people to help me
65
00:03:35,760 --> 00:03:38,320
around and about as the
businesses kind of grew and
66
00:03:38,320 --> 00:03:39,840
said, OK, we've got more to do
here.
67
00:03:41,120 --> 00:03:44,280
And then as a result of that,
we've sort of done over two
68
00:03:44,280 --> 00:03:49,640
decades in the L&D space really
assisting businesses to put
69
00:03:49,640 --> 00:03:52,840
together learning, but using
technology to effectively
70
00:03:52,840 --> 00:03:55,600
deliver that.
So anything training, anything
71
00:03:55,600 --> 00:03:57,600
sort of learning management
system connected or learning
72
00:03:57,600 --> 00:04:00,160
platform connected, but we're
talking about anything it
73
00:04:00,160 --> 00:04:03,720
connects to like API systems,
external, talking about the
74
00:04:04,640 --> 00:04:09,200
strategies around that.
And yeah, we are
75
00:04:09,560 --> 00:04:13,440
primarily sort of you know tech
focused as a business, but with
76
00:04:13,440 --> 00:04:17,480
the background knowledge really
of the space, of knowing
77
00:04:17,640 --> 00:04:20,880
probably most things there are
to know now in L&D really.
78
00:04:24,080 --> 00:04:27,920
I mean, it's so
nice to hear of a journey like
79
00:04:27,920 --> 00:04:30,720
you started out like in history,
like a lot of us started in
80
00:04:30,720 --> 00:04:32,840
different places.
Like I was a high school teacher
81
00:04:33,280 --> 00:04:35,560
and then you kind of found like
where your passion, where your
82
00:04:35,560 --> 00:04:38,480
niche was and you just went for
it because you're not just the
83
00:04:38,480 --> 00:04:40,840
founder of Thirst.
And if you haven't seen Thirst,
84
00:04:40,840 --> 00:04:43,440
you're probably not on
LinkedIn a lot because their
85
00:04:43,440 --> 00:04:47,560
marketing is amazing.
And you've, you've definitely
86
00:04:47,560 --> 00:04:49,520
seen Barry post a lot of stuff
out there.
87
00:04:49,520 --> 00:04:53,760
He's their marketing director.
And if you haven't heard of
88
00:04:53,760 --> 00:04:55,840
Thirst, you need to go hear of
Thirst now.
89
00:04:55,840 --> 00:04:57,000
You need to go out to their
site.
90
00:04:57,000 --> 00:05:01,560
So given that, you know, the
journey has been one that's been
91
00:05:01,680 --> 00:05:04,440
of different opportunities,
different things for you.
92
00:05:04,680 --> 00:05:08,400
When you think about the L&D
space, what do you think are
93
00:05:08,400 --> 00:05:10,960
some of the, the greatest
challenges for us?
94
00:05:11,040 --> 00:05:13,880
And like how it currently is
like the things happening with
95
00:05:13,920 --> 00:05:17,680
AI and, and all those things
that are kind of impacting our
96
00:05:17,680 --> 00:05:20,360
industry.
I think it's kind of the same
97
00:05:20,360 --> 00:05:22,800
across many industries and not
just L&D.
98
00:05:22,800 --> 00:05:26,520
And I think L&D has typically
been a little bit historically
99
00:05:26,520 --> 00:05:29,320
behind in terms of just trying
to make sure it moves forward.
100
00:05:29,320 --> 00:05:32,840
And you know, I touched upon
Macromedia Flash there, but I mean
101
00:05:32,840 --> 00:05:35,960
we're still sort of using SCORM
courses and things like that and
102
00:05:35,960 --> 00:05:38,560
the, the early standards and we
didn't even move forward into
103
00:05:38,560 --> 00:05:40,840
the xAPI things really that
well.
104
00:05:41,080 --> 00:05:43,480
So we've always been a little
bit backward thinking as a, as a
105
00:05:43,480 --> 00:05:46,600
sector, but you know, with these
new technologies that are
106
00:05:46,600 --> 00:05:49,000
current, you know, we talk about
AI loads, we've done quite a lot
107
00:05:49,000 --> 00:05:51,600
of that, but it's, it's so fast
moving.
108
00:05:52,200 --> 00:05:55,880
You can't quite, even if
you're not thinking about L&D
109
00:05:55,880 --> 00:05:57,800
and you're just trying to keep
up with the news, the latest
110
00:05:57,800 --> 00:06:00,560
sort of technology enhancements
there, you can't really do that.
111
00:06:00,920 --> 00:06:05,640
I went to an Amazon Web Services
conference where, I
112
00:06:05,640 --> 00:06:08,480
think the guys from Anthropic were
there, one of the AI
113
00:06:08,480 --> 00:06:11,240
providers and they were talking
about a new standard that they
114
00:06:11,240 --> 00:06:13,040
got.
And they said, oh, this, this
115
00:06:13,040 --> 00:06:15,360
standard's actually changed
since last time we presented it.
116
00:06:15,440 --> 00:06:18,200
And I think I went in, I don't
know, it was like April and they
117
00:06:18,200 --> 00:06:20,280
said we last presented it in
February.
118
00:06:20,560 --> 00:06:23,760
Within like 45 days.
It was completely different.
119
00:06:24,200 --> 00:06:27,320
So you sort of roll that into
what we're doing in the L&D space
120
00:06:27,320 --> 00:06:29,440
and try to keep up with that.
It's mind boggling.
121
00:06:30,480 --> 00:06:34,280
I think though, in regards to
that, the greatest
122
00:06:34,280 --> 00:06:38,280
challenges we have are around
getting learning to the people
123
00:06:38,280 --> 00:06:41,120
and the places they are and how
they want to interoperate with
124
00:06:41,120 --> 00:06:43,520
it.
So, you know, if you're using a
125
00:06:43,520 --> 00:06:47,160
lot of the AI tooling now, it's
almost become a conversational
126
00:06:47,160 --> 00:06:51,400
kind of piece where you're not
browsing the web for pages
127
00:06:51,400 --> 00:06:52,800
anymore.
You're not going to find that
128
00:06:52,920 --> 00:06:54,480
learning content in the same
way.
129
00:06:54,920 --> 00:06:58,040
It's either being surfaced to
you or it's kind of answering
130
00:06:58,040 --> 00:07:01,920
direct questions or giving you
guidance and personalized advice
131
00:07:01,920 --> 00:07:06,280
around that.
So it's how do we restructure
132
00:07:06,280 --> 00:07:10,440
how training and learning is
presented back to the learner in
133
00:07:10,440 --> 00:07:14,320
a way that kind of matches what
effectively this future of
134
00:07:14,320 --> 00:07:16,200
kind of connected technologies
looks like.
135
00:07:16,680 --> 00:07:19,520
It's going to be fun.
It's going to be, it's going to
136
00:07:19,520 --> 00:07:24,080
be rough, I think for a while.
Yeah, I agree with you 100%.
137
00:07:24,320 --> 00:07:27,120
So tell us about Thirst.
Like, what is Thirst?
138
00:07:27,120 --> 00:07:30,120
What do you do?
How do you help people in the
139
00:07:30,120 --> 00:07:32,360
L&D space and tell us all about
it.
140
00:07:32,680 --> 00:07:34,000
Tell us what it what you're up
to.
141
00:07:34,600 --> 00:07:36,600
Yeah, no problem.
Well, when we built Thirst we
142
00:07:36,600 --> 00:07:39,680
sort of spotted the gap in the
market for trying to use some of
143
00:07:39,680 --> 00:07:41,440
these technologies from the start.
144
00:07:41,440 --> 00:07:44,080
So we baked in personalisation
all the way through
145
00:07:44,080 --> 00:07:45,920
the product.
So everything you do in there
146
00:07:45,920 --> 00:07:48,480
is surfacing the right,
latest information that's
147
00:07:48,480 --> 00:07:50,920
relevant to you and we're sort
of accounting for everything
148
00:07:50,920 --> 00:07:53,320
that you do.
So every sort of interaction
149
00:07:53,320 --> 00:07:56,400
like a comment or a share, whether it's
certain authors or certain
150
00:07:56,400 --> 00:07:59,000
content types where we're
forming that opinion of what
151
00:07:59,000 --> 00:08:02,360
content, and then that gets
surfaced to you at the right point.
152
00:08:02,720 --> 00:08:05,880
We also baked in skills
completely from the ground up,
153
00:08:05,880 --> 00:08:08,440
which again, depending on whether
you know your learning
154
00:08:08,440 --> 00:08:10,760
platforms or not, you know,
they tended to be an
155
00:08:10,760 --> 00:08:12,680
afterthought.
And a lot of the more kind of
156
00:08:12,680 --> 00:08:15,720
previous incarnations of
learning platforms tacked it on at the
157
00:08:15,720 --> 00:08:17,560
end.
We reversed that and said we're
158
00:08:17,560 --> 00:08:20,640
going to start there, which
meant we were sort of forward
159
00:08:20,640 --> 00:08:24,560
thinking from the get go in
terms of what the future of L&D
160
00:08:24,560 --> 00:08:28,080
looks like, the future of just
business really on how it's
161
00:08:28,080 --> 00:08:31,840
going to train and hire people.
And then obviously we sort of
162
00:08:31,840 --> 00:08:35,280
rolled that way from the AI side
as well, which has meant that
163
00:08:35,280 --> 00:08:37,000
we've put some really nice
features and functionality in
164
00:08:37,000 --> 00:08:39,280
there as well.
But one of our core
165
00:08:39,520 --> 00:08:43,960
messages really is just making
the platform, the messaging that we started off
166
00:08:43,960 --> 00:08:47,080
with, which was 'a
platform you want to use', because
167
00:08:47,360 --> 00:08:48,880
we found with a lot of learning
platforms:
168
00:08:48,880 --> 00:08:51,040
You almost need a training
program to use the platform.
169
00:08:51,320 --> 00:08:53,600
It's too difficult, you know,
you couldn't quite... Training
170
00:08:53,600 --> 00:08:55,920
programs like: this is it,
171
00:08:56,480 --> 00:08:58,680
intermediate, advanced.
Yeah.
172
00:08:58,680 --> 00:09:00,200
And nothing felt intuitive
either.
173
00:09:00,200 --> 00:09:02,080
You'd sort of go off and the
administration area was
174
00:09:02,080 --> 00:09:03,920
somewhere else and it looked
very different.
175
00:09:03,920 --> 00:09:06,200
And that was it. We came at it
completely fresh and said, OK,
176
00:09:06,200 --> 00:09:08,040
we're going to build these
latest technologies and tools in
177
00:09:08,040 --> 00:09:08,840
there.
We're going to come at
178
00:09:08,840 --> 00:09:10,400
skills from the
ground up.
179
00:09:10,880 --> 00:09:12,600
And we're also going to make it
very usable.
180
00:09:12,960 --> 00:09:16,400
And then we also layered in like
a social learning piece as well.
181
00:09:16,400 --> 00:09:18,680
So there's a lot around tacit
knowledge sharing.
182
00:09:19,080 --> 00:09:23,000
By default, everybody in Thirst
can actually create content and
183
00:09:23,000 --> 00:09:25,320
surface it to their colleagues
in the business as well.
184
00:09:25,640 --> 00:09:28,960
So that means you're getting,
especially for your sort of SME
185
00:09:28,960 --> 00:09:32,880
size of business, it's brilliant
because you've got smaller L&D
186
00:09:32,880 --> 00:09:35,560
teams or sometimes non-existent
teams.
187
00:09:36,800 --> 00:09:38,920
And it means that other people
in the business can effectively
188
00:09:38,920 --> 00:09:42,680
still facilitate training with
the organization, get the right
189
00:09:42,680 --> 00:09:44,880
knowledge to the right people
without it sort of being
190
00:09:44,880 --> 00:09:47,640
bottlenecked by, you know, a
smaller team size or whatever
191
00:09:47,640 --> 00:09:49,600
it might be.
So yeah, those are the other
192
00:09:49,600 --> 00:09:52,160
problems we're solving, and
solving them pretty nicely at
193
00:09:52,160 --> 00:09:54,040
the moment.
That's good.
194
00:09:55,000 --> 00:09:56,640
You know, I love it.
Like I said, I love your
195
00:09:56,640 --> 00:09:58,320
marketing.
I love what you're doing.
196
00:09:58,320 --> 00:10:02,160
And you know, I think that one
of the things that's come out of
197
00:10:02,160 --> 00:10:06,320
like AI is this more
personalized learning experience
198
00:10:06,520 --> 00:10:09,840
instead of like just everybody
does the same thing through the
199
00:10:09,840 --> 00:10:13,240
whole semester, the whole
training or things like that.
200
00:10:13,240 --> 00:10:16,720
Like you're meeting people where
they're at. Like, I
201
00:10:16,720 --> 00:10:19,320
wouldn't want to sit through a
training where it talks about
202
00:10:19,720 --> 00:10:22,680
instructional design theories
and methodologies.
203
00:10:22,680 --> 00:10:24,520
I already know all this stuff.
I want to sit in a training
204
00:10:24,520 --> 00:10:27,160
where it's something that's
instant application.
205
00:10:27,600 --> 00:10:30,760
I can instantly use it.
So what do you think about the
206
00:10:30,760 --> 00:10:32,840
more personalized learning
experience and how Thirst
207
00:10:32,840 --> 00:10:34,240
is kind of implementing
that?
208
00:10:35,920 --> 00:10:38,560
I think it's critical, I think I
mean, I don't have the study
209
00:10:38,680 --> 00:10:41,080
percentages to hand, but there's
some great studies which show
210
00:10:41,080 --> 00:10:43,600
the effectiveness
of personalized learning and how
211
00:10:43,640 --> 00:10:46,440
that sort of impacts people's
retention and ability to kind of
212
00:10:46,440 --> 00:10:49,840
learn.
I think it's funny again,
213
00:10:49,840 --> 00:10:52,480
what you kind of started the
podcast with, the questions
214
00:10:52,480 --> 00:10:54,640
around how technology is sort of
shaping that.
215
00:10:55,080 --> 00:11:00,600
And I sort of remember it was
probably only four years ago or
216
00:11:00,640 --> 00:11:01,800
so.
So where they were basically we
217
00:11:01,800 --> 00:11:05,240
couldn't quite... I think it was
Filtered, the company Filtered.
218
00:11:05,240 --> 00:11:07,960
They were trying to actually sort of
personalize the content and
219
00:11:07,960 --> 00:11:10,120
trying to establish and
understand the content and map
220
00:11:10,120 --> 00:11:12,720
that to skill levels and certain
types of skills.
221
00:11:12,720 --> 00:11:15,160
And they were having real
challenges with that at that
222
00:11:15,160 --> 00:11:17,320
point and trying to
make it work really, really
223
00:11:17,320 --> 00:11:19,800
well.
And then obviously the AI side
224
00:11:19,800 --> 00:11:22,560
came into that and all of a
sudden it just sort of replaced
225
00:11:22,560 --> 00:11:25,080
that problem with just a
solution straight away.
226
00:11:25,080 --> 00:11:28,400
Just there you go, we can do
that now, which is formidable.
227
00:11:28,560 --> 00:11:31,880
And when you then team up with
the personalisation side, that
228
00:11:31,880 --> 00:11:34,760
automatic mapping of content,
basically if you can get that
229
00:11:34,840 --> 00:11:39,080
established skill level
of that content as well, then it
230
00:11:39,080 --> 00:11:41,880
just changes it again.
And so yeah, personalized
231
00:11:41,880 --> 00:11:43,720
learning journeys, like
personalized journey through.
232
00:11:43,720 --> 00:11:45,760
And we're taking into account,
we're trying to take into
233
00:11:45,760 --> 00:11:48,400
account everything.
So it can even be, you know,
234
00:11:48,400 --> 00:11:51,000
what content source it's coming
from, if there's a certain
235
00:11:51,000 --> 00:11:55,280
author within your organization
who you kind of follow or aspire
236
00:11:55,280 --> 00:11:58,160
to kind of look towards for a
career pathway that can also do
237
00:11:58,160 --> 00:12:00,160
that.
But it's also factoring in, you
238
00:12:00,160 --> 00:12:02,240
know, the more basic elements
like what skills you're seeking
239
00:12:02,240 --> 00:12:05,120
and things like that.
All that combined, you get a lot
240
00:12:05,120 --> 00:12:09,480
more just impact of the delivery
of the training really.
241
00:12:10,120 --> 00:12:12,240
Yeah, absolutely.
And I'm just thinking back,
242
00:12:12,720 --> 00:12:15,280
like, remember how much time we
would spend on doing things like
243
00:12:15,400 --> 00:12:18,120
descriptions and building out
content now?
244
00:12:18,400 --> 00:12:20,560
Tagging, Tagging content, yeah,
yes.
245
00:12:21,080 --> 00:12:25,560
And tagging content, it just
literally, like, shifted
246
00:12:26,040 --> 00:12:28,760
overnight.
I love it though, because I feel
247
00:12:28,760 --> 00:12:32,200
like, you know, AI and some,
some other tools have really
248
00:12:32,200 --> 00:12:36,120
allowed us to be instructional
designers or L&D professionals
249
00:12:36,120 --> 00:12:38,160
and jump into the creative
space.
250
00:12:38,160 --> 00:12:42,680
Now, like, we get really
like caught up in some of the
251
00:12:42,680 --> 00:12:45,600
administrative tasks that are
very time
252
00:12:45,600 --> 00:12:47,200
consuming.
They might be small tasks, but
253
00:12:47,200 --> 00:12:51,160
they become very time consuming.
And I'm thinking of like when
254
00:12:51,160 --> 00:12:54,080
you're building a course or
you're building a workshop or
255
00:12:54,080 --> 00:12:56,280
something and you have to
outline everything and then you
256
00:12:56,280 --> 00:12:59,200
have to design it and then you
got to deploy it and all that
257
00:12:59,200 --> 00:13:02,120
stuff.
But really like pieces of that
258
00:13:02,120 --> 00:13:07,360
have changed because of AI.
And I feel like I can really sit
259
00:13:07,360 --> 00:13:10,680
in a creative space now and
think about other things I could
260
00:13:10,680 --> 00:13:14,320
do during the live workshop that
are going to impact the learner,
261
00:13:14,320 --> 00:13:19,320
engage them or offer pieces of
advice or scenarios during that
262
00:13:19,320 --> 00:13:20,800
training.
Yeah.
263
00:13:20,800 --> 00:13:23,720
It's part of the...
Yeah, it's part of the L&D
264
00:13:23,720 --> 00:13:25,840
Unleashed event that we were
talking about before.
265
00:13:25,840 --> 00:13:26,920
Can I ask
about that?
266
00:13:27,200 --> 00:13:32,400
Yeah, we did.
I trusted myself to do a live demo
267
00:13:32,400 --> 00:13:35,840
of how to use AI as your
copilot, which is always quite
268
00:13:35,840 --> 00:13:37,560
dangerous.
It worked all right, actually
269
00:13:37,560 --> 00:13:39,880
largely speaking.
But what we were doing is we
270
00:13:39,880 --> 00:13:42,720
were just giving it information
and using some of the sort of
271
00:13:42,720 --> 00:13:47,160
reasoning models that are there
now and saying can you kind of
272
00:13:47,160 --> 00:13:49,720
give me the job roles at this
business?
273
00:13:49,720 --> 00:13:51,960
And we would name the business
and give it some, some pointers
274
00:13:51,960 --> 00:13:55,040
on the web, you know, to a
website and it would go off and
275
00:13:55,040 --> 00:13:57,200
try and bring back the list of
probable job roles.
276
00:13:57,200 --> 00:14:00,400
And it would look at the job
positions that are on its
277
00:14:00,400 --> 00:14:02,240
careers page and try to map
them back.
278
00:14:02,520 --> 00:14:04,320
And then we extended that out
and said, well, can you give me
279
00:14:04,320 --> 00:14:06,280
the skills of these job roles?
Then can you give me the tiers
280
00:14:06,280 --> 00:14:07,720
of the job roles and how the
tiers mapped?
281
00:14:07,880 --> 00:14:10,760
And we basically built skills,
matrices and frameworks out of
282
00:14:10,760 --> 00:14:13,520
it by just pointing it to the
direction of business
283
00:14:13,640 --> 00:14:15,360
information and sort of asking
questions.
284
00:14:15,840 --> 00:14:18,240
And then even took that further
and said, OK, where's the skill
285
00:14:18,240 --> 00:14:20,120
gaps that we need?
And also can you give me some
286
00:14:20,120 --> 00:14:21,920
titles?
Can you give me a strategy to
287
00:14:22,600 --> 00:14:25,360
teach that skill, to sort of
close that gap?
288
00:14:26,040 --> 00:14:28,360
You know, we're not going as far
as creating all the content
289
00:14:28,360 --> 00:14:30,720
here.
And part of the talk as well was
290
00:14:31,000 --> 00:14:33,800
this doesn't replace, you know,
instructional designers or L&D kind
291
00:14:33,800 --> 00:14:35,480
of professionals.
A piece of it, yeah.
292
00:14:35,560 --> 00:14:37,640
People are so nervous about
that, Fred.
293
00:14:37,640 --> 00:14:41,320
They're so nervous, to a
point where it's
294
00:14:41,320 --> 00:14:45,480
like it's really
negatively impacting their view
295
00:14:45,480 --> 00:14:49,120
of the tool and how to use it.
Yeah, almost a bit scared that
296
00:14:49,120 --> 00:14:50,040
it's going to replace their
jobs.
297
00:14:50,040 --> 00:14:51,120
So they're kind of trying to
ignore it.
298
00:14:51,680 --> 00:14:54,880
Yeah, we know kind of the human
connection in L&D is so
299
00:14:54,880 --> 00:14:57,560
important anyway and there's a
lot of that side of it that
300
00:14:57,560 --> 00:14:59,640
really still matters.
But you can't simply replace
301
00:14:59,640 --> 00:15:02,160
that knowledge either.
And we are sort of saying it's
302
00:15:02,160 --> 00:15:04,520
more of an accelerant.
It's kind of like it's what you
303
00:15:04,520 --> 00:15:07,240
talked about there, which is
that kind of getting stuck in the
304
00:15:07,240 --> 00:15:09,400
mundane, in the kind of
administrative side.
305
00:15:09,640 --> 00:15:12,480
Well, all of a sudden you can
just clear a lot of that and
306
00:15:12,480 --> 00:15:14,240
actually really focus on where
the added value is.
307
00:15:14,240 --> 00:15:17,360
And that's really exciting.
I think it's going to be
308
00:15:17,360 --> 00:15:20,920
challenging to make sure
everybody understands how to use
309
00:15:20,920 --> 00:15:23,000
it because it's a brand new
skill, prompt engineering,
310
00:15:23,000 --> 00:15:25,480
etcetera.
And then also having the
311
00:15:25,520 --> 00:15:28,600
businesses and organisations
approve its use because there's
312
00:15:28,880 --> 00:15:30,400
data protection elements to
that.
313
00:15:30,880 --> 00:15:33,200
But I think I've referenced this
a few times.
314
00:15:33,200 --> 00:15:36,360
I think it's like when everybody
was moving to the cloud sort of
315
00:15:36,360 --> 00:15:39,680
whatever it was 10 years ago and
everyone's really scared of it
316
00:15:39,680 --> 00:15:41,840
and it was like we can't do it.
And you know, it's there's
317
00:15:42,040 --> 00:15:45,400
problems with security and
safety and then we're just all
318
00:15:45,400 --> 00:15:47,320
there now and it sort of went
away.
319
00:15:47,680 --> 00:15:49,080
And I kind of feel that's where
we're at.
320
00:15:49,080 --> 00:15:51,560
We're at that early stage, only
a few years into this journey on
321
00:15:51,560 --> 00:15:54,440
on that new tech side.
I think, sort of fast forward
322
00:15:54,440 --> 00:15:56,360
five years, it'll become
so ubiquitous.
323
00:15:56,360 --> 00:15:58,640
We never understood how we.
Won't even notice.
324
00:15:58,720 --> 00:15:59,920
It's just gonna be there.
Yeah.
325
00:15:59,960 --> 00:16:02,440
Like it's, it's just a part of
everything that we do.
326
00:16:04,240 --> 00:16:06,480
Absolutely.
I think that's 100% going to
327
00:16:06,480 --> 00:16:08,360
happen.
And whether people like it or
328
00:16:08,360 --> 00:16:10,840
not, like these technologies,
like the cloud and stuff,
329
00:16:10,840 --> 00:16:14,320
that's, that's the norm, that
becomes the norm for everything.
330
00:16:14,320 --> 00:16:16,800
That's what you should be using
to save your stuff on.
331
00:16:16,800 --> 00:16:19,360
And my grandparents are in
their, their mid 80s.
332
00:16:19,360 --> 00:16:20,960
They're like, where is this
cloud?
333
00:16:22,200 --> 00:16:25,720
And I'm like, it's not a cloud.
It's not a cloud in the sky.
334
00:16:25,840 --> 00:16:28,400
It's, it's out on a server
somewhere.
335
00:16:28,440 --> 00:16:31,000
So you know.
I do have sympathy though, a
336
00:16:31,000 --> 00:16:32,520
little bit with... I mean.
337
00:16:33,240 --> 00:16:34,840
Yeah, I'm, I'm certainly getting
older.
338
00:16:34,840 --> 00:16:36,520
I mean, as we all are, we don't
go backwards.
339
00:16:36,760 --> 00:16:40,000
But I mean, we've lived through
quite a generational kind of
340
00:16:40,000 --> 00:16:42,280
shift in terms of technology.
But you kind of imagine, you
341
00:16:42,280 --> 00:16:44,400
know, our parents and what
they've had to kind of come from
342
00:16:44,400 --> 00:16:47,160
almost not having, you
know, televisions and
343
00:16:47,160 --> 00:16:49,800
microwaves would just be coming
out as a thing that even
344
00:16:49,800 --> 00:16:51,160
existed.
And then to suddenly dealing
345
00:16:51,160 --> 00:16:54,240
with AI, the shift in their
lifetime.
346
00:16:54,240 --> 00:16:55,920
It's incredible, isn't it, to
try and understand?
347
00:16:55,960 --> 00:16:57,720
It really is.
But they have me, so they're
348
00:16:57,720 --> 00:16:58,680
fine.
I mean, they have.
349
00:16:58,960 --> 00:17:02,200
iPads, they have smart
TVs. Every time I go there, my
350
00:17:02,200 --> 00:17:04,680
grandfather has a list of
technologies, things he
351
00:17:04,680 --> 00:17:06,200
wants to learn.
So they're fine.
352
00:17:06,200 --> 00:17:09,200
They can operate, you know,
better than some other
353
00:17:09,200 --> 00:17:11,920
people I know.
But yeah, I really do have the
354
00:17:11,960 --> 00:17:15,920
empathy for the process.
Like when I was growing up, the
355
00:17:15,920 --> 00:17:18,640
Internet was just coming out and
chat rooms were like the big
356
00:17:18,640 --> 00:17:20,720
thing.
Like I remember we got our first
357
00:17:20,720 --> 00:17:22,640
computer.
It was a Gateway computer in the
358
00:17:22,640 --> 00:17:24,040
United States.
And if you know Gateway, it's
359
00:17:24,040 --> 00:17:28,560
like you rent those computers
and we just sat there, couldn't
360
00:17:28,560 --> 00:17:31,240
use the phone when you're on the
Internet.
361
00:17:31,840 --> 00:17:34,680
You know, we've been through
that stuff, which is really cool
362
00:17:34,680 --> 00:17:38,120
because I, I think it sets like
the foundation of what we know,
363
00:17:38,120 --> 00:17:41,040
like kind of learning long
division before, you know, short
364
00:17:41,040 --> 00:17:43,440
division.
So it really gives you that
365
00:17:43,440 --> 00:17:47,040
perspective.
And like these kids nowadays,
366
00:17:47,040 --> 00:17:51,040
these young'uns,
as we say in the South, they
367
00:17:51,040 --> 00:17:56,400
have no idea, no idea what it
was like to like not have a cell
368
00:17:56,400 --> 00:17:58,680
phone with you all the time,
like coming home when the
369
00:17:58,680 --> 00:18:02,640
street lights turn on.
They have no idea.
370
00:18:03,360 --> 00:18:04,320
Yeah, I know.
Yeah.
371
00:18:04,320 --> 00:18:06,680
I mean, I'm, I'm not going to
make a judgement, Holly, on your
372
00:18:06,680 --> 00:18:09,000
age, but I, I feel like we grew
up in a very similar time
373
00:18:09,000 --> 00:18:10,280
because these are my memories as
well.
374
00:18:10,280 --> 00:18:12,280
But we do,
too, which is awesome.
375
00:18:13,440 --> 00:18:14,920
Yeah, it's good.
There are always challenges.
376
00:18:14,920 --> 00:18:17,040
I think there's a certain
element of retro as well that
377
00:18:17,040 --> 00:18:18,360
people are heading back towards
that.
378
00:18:18,360 --> 00:18:20,840
I hear that the latest Gen.
Z are actually seeking out kind
379
00:18:20,840 --> 00:18:23,240
of like landline phones again
because they kind of want to get
380
00:18:23,240 --> 00:18:24,880
disconnected.
So it's, it's interesting.
381
00:18:24,880 --> 00:18:27,840
It goes full, full circle, yeah.
I love their brains and the way
382
00:18:27,840 --> 00:18:29,720
that they think and the way
they're changing the work
383
00:18:29,720 --> 00:18:32,800
culture as well.
You know, we're having a
384
00:18:32,800 --> 00:18:35,200
conversation recently about like
Gen.
385
00:18:35,200 --> 00:18:37,800
Z. They're saying, like, Gen.
Z doesn't want to work.
386
00:18:38,120 --> 00:18:40,000
Well, that's not the case.
They work differently.
387
00:18:40,000 --> 00:18:44,160
They're just not used to, you know,
working the 9:00 to 5:00, which
388
00:18:44,160 --> 00:18:47,240
is an archaic, outdated process
anyways.
389
00:18:48,520 --> 00:18:51,560
So it's really cool to to tap
into their brains and their
390
00:18:51,560 --> 00:18:53,920
minds and see how they're going
to, they're going to change the
391
00:18:53,920 --> 00:18:56,600
game for sure.
Yeah, I think so.
392
00:18:56,600 --> 00:19:00,080
I mean, they have a
different approach.
393
00:19:00,320 --> 00:19:03,320
I think they're almost more
394
00:19:03,320 --> 00:19:08,960
demanding of kind of excellence
to some degree, right.
395
00:19:09,200 --> 00:19:11,960
It's kind of like the boundaries
that we should have set and
396
00:19:11,960 --> 00:19:14,880
expectations we should have set.
We didn't, so they're doing
397
00:19:14,880 --> 00:19:17,120
it.
Yeah, a little bit. Like, we
398
00:19:17,120 --> 00:19:19,440
can't get it, you know. In the
same way, why we built Thirst is
399
00:19:19,440 --> 00:19:22,040
because it's kind of like the
the market is...
400
00:19:22,120 --> 00:19:26,080
You couldn't put
an old learning platform to the
401
00:19:26,080 --> 00:19:27,480
Gen.
Z audience because they just
402
00:19:27,480 --> 00:19:29,840
simply wouldn't accept it.
They would be more demanding
403
00:19:29,840 --> 00:19:32,000
of the quality of
everything that gets delivered
404
00:19:32,000 --> 00:19:33,960
to them, the quality of the
businesses they interoperate
405
00:19:33,960 --> 00:19:36,080
with, you know, in terms of like
the sustainability, all the
406
00:19:36,080 --> 00:19:38,680
pledges, all the kind of the B
Corp elements, all these, these
407
00:19:38,680 --> 00:19:40,920
parts.
They've got more demands and
408
00:19:40,920 --> 00:19:43,640
expectations on it, which I
think is only a good thing
409
00:19:43,640 --> 00:19:45,640
realistically.
So it drives everybody forward.
410
00:19:46,280 --> 00:19:48,760
Absolutely.
So what are you excited about
411
00:19:48,760 --> 00:19:51,240
for the future?
Like the future of learning and
412
00:19:51,240 --> 00:19:54,160
development, Like with all this
AI stuff coming in, what are you
413
00:19:54,160 --> 00:19:58,800
excited about?
So we're going over this
414
00:19:58,800 --> 00:20:01,000
kind of the same ground, but my
415
00:20:01,000 --> 00:20:04,920
thoughts are about what we can do
faster and what we can
416
00:20:05,480 --> 00:20:09,920
understand that we either (a)
couldn't understand before or (b)
417
00:20:09,920 --> 00:20:13,360
took too long to understand.
So a great example is the data
418
00:20:13,360 --> 00:20:17,200
analysis part of L&D.
You know, historically we talked
419
00:20:17,200 --> 00:20:21,000
about SCORM, SCORM one point
whatever, 1.1 and 1.2,
420
00:20:21,000 --> 00:20:24,280
etcetera, was all focused on: did
you complete it?
421
00:20:24,480 --> 00:20:28,560
And if so, did you pass, and
maybe it stretched to:
422
00:20:28,560 --> 00:20:32,240
How long did it take?
And we brought in, you know, so
423
00:20:32,240 --> 00:20:34,200
many other metrics that you
could do with objectives and
424
00:20:34,200 --> 00:20:37,120
everything else, but it was, it
was almost too difficult to set
425
00:20:37,120 --> 00:20:41,120
up and interpret.
And now you've got a scenario
426
00:20:41,120 --> 00:20:44,080
where you can track every single
data point, you know, every kind
427
00:20:44,080 --> 00:20:47,040
of eyeball click, movement,
etcetera.
428
00:20:47,520 --> 00:20:52,240
And you can establish patterns
and pathways and data points on
429
00:20:52,240 --> 00:20:54,240
that, and that can inform the
strategy.
430
00:20:54,240 --> 00:20:57,440
So I mean, at a really simple level,
and this is nowhere
431
00:20:57,440 --> 00:21:00,320
near as far as it is going and
can go.
432
00:21:00,680 --> 00:21:05,960
But we are spotting gaps in
content that's not available for
433
00:21:05,960 --> 00:21:08,760
certain skills that people are
seeking in the business because
434
00:21:08,760 --> 00:21:11,080
we're using all the data
points about what they're
435
00:21:11,080 --> 00:21:13,880
searching and what skills they
have versus how many users we
436
00:21:13,880 --> 00:21:15,680
have.
And we can map that back to
437
00:21:15,680 --> 00:21:19,000
saying, well, the content is not
matching the requirements there.
438
00:21:19,320 --> 00:21:22,760
Even that, as simplistic as it
sounds, was relatively
439
00:21:22,760 --> 00:21:25,520
difficult to do previously
because you had to have so much
440
00:21:25,520 --> 00:21:29,160
data and sort of transpose that.
And now it's just in systems,
441
00:21:29,160 --> 00:21:32,520
it's in platforms to do that.
And that's exciting to me all
442
00:21:32,520 --> 00:21:36,120
the time.
What else can we discover that
443
00:21:36,120 --> 00:21:40,120
we didn't discover?
And I'm also a weird advocate,
444
00:21:40,120 --> 00:21:42,000
sort of leaning away from the AI
side a second.
445
00:21:42,000 --> 00:21:45,920
I'm a weird advocate of how
marketing can be used really
446
00:21:45,920 --> 00:21:50,400
well psychologically to kind of
engage you on the L&D side as
447
00:21:50,400 --> 00:21:52,560
well.
And we've talked about this
448
00:21:52,560 --> 00:21:54,680
before on, on one of the
presentations I've previously
449
00:21:54,680 --> 00:21:59,120
done, but it was around you can
build scarcity, you can do FOMO,
450
00:21:59,120 --> 00:22:01,840
you can do these elements that are
really popular in things like
451
00:22:01,840 --> 00:22:04,000
social media.
They really draw you in, they
452
00:22:04,000 --> 00:22:05,760
draw you back and you open up
the app every day.
453
00:22:05,760 --> 00:22:08,680
You want to see what's happening
and they work across the board.
454
00:22:08,680 --> 00:22:11,720
It doesn't really matter that
it's L&D, but I think we're just
455
00:22:11,720 --> 00:22:16,120
getting smarter with how to
engage and, and build that kind
456
00:22:16,120 --> 00:22:18,120
of interaction with our learners
that I don't think we had
457
00:22:18,120 --> 00:22:21,080
before.
So these things are where I like
458
00:22:21,080 --> 00:22:22,400
how we're heading and where
we're going to.
459
00:22:23,000 --> 00:22:25,280
Yeah, me too.
And I think that anybody that's
460
00:22:25,320 --> 00:22:27,880
thinking about getting into L&D or
thinking about getting into
461
00:22:27,880 --> 00:22:30,760
marketing or whatever they're
trying to do, you need to take a
462
00:22:30,760 --> 00:22:33,640
psychology course for sure.
And kind of learn some of
463
00:22:33,640 --> 00:22:36,320
those tactics.
Like, I didn't realize how much
464
00:22:36,320 --> 00:22:41,080
psychology is in L&D and in
marketing. I mean, like you said,
465
00:22:41,080 --> 00:22:43,320
the FOMO stuff.
And you're really getting deep
466
00:22:43,320 --> 00:22:46,600
into the psyche of like, you
know, why do people sit on
467
00:22:46,600 --> 00:22:49,920
TikTok like myself at bed and
scroll for hours and hours a
468
00:22:49,920 --> 00:22:51,360
night?
What's what's there?
469
00:22:51,360 --> 00:22:54,200
What's doing that?
That's like become an addiction,
470
00:22:54,640 --> 00:22:57,640
you know.
Just coming back, I've got a really simple
471
00:22:57,640 --> 00:23:01,320
kind of example, a
real-world example,
472
00:23:01,320 --> 00:23:03,480
which shows how well
this works and then how you can
473
00:23:03,480 --> 00:23:07,720
apply that to L&D, which is we
all kind of have herd
474
00:23:07,720 --> 00:23:10,040
mentalities as humans.
That's kind of how we operate.
475
00:23:10,440 --> 00:23:14,720
And with hotels, they've found,
if they want to
476
00:23:14,960 --> 00:23:17,320
save on the amount of laundry
they're doing on the
477
00:23:17,480 --> 00:23:20,320
towel washing, they'll put a
sign in the bedroom to
478
00:23:20,320 --> 00:23:22,240
say, you know, to tell people to
do that.
479
00:23:22,400 --> 00:23:25,880
But they would have more success
if they would put a statistic on
480
00:23:25,880 --> 00:23:30,400
there that says, join your other
fellow guests and 70% of them
481
00:23:30,400 --> 00:23:32,880
reuse their towel or whatever
the stat is.
482
00:23:32,920 --> 00:23:35,720
And basically it, it makes
everybody else think, oh, if
483
00:23:35,720 --> 00:23:37,080
everybody else is doing it, I'm
doing it.
484
00:23:37,080 --> 00:23:38,840
And then suddenly you get better
uptake of it.
485
00:23:39,240 --> 00:23:41,000
Now, it's incredible.
It's really simple.
486
00:23:41,000 --> 00:23:43,200
Just the messaging on the sign
changes that behaviour.
487
00:23:43,640 --> 00:23:45,720
If you apply that to L&D, you
could have that.
488
00:23:45,800 --> 00:23:48,040
At Disney, Fred, they do that at
Disney.
489
00:23:48,720 --> 00:23:50,240
There you go, the Disney
conversation.
490
00:23:50,960 --> 00:23:53,200
We'll not go on Disney.
We'll be branching off into
491
00:23:53,200 --> 00:23:54,720
Disney conversations and then
we'll come back.
492
00:23:55,680 --> 00:23:58,720
But on the L&D side, I mean, if
you want your learners
493
00:23:58,720 --> 00:24:01,360
to do a certain course and then
follow up with a second course,
494
00:24:01,360 --> 00:24:03,480
for instance, you can say, well,
70% of the people who've
495
00:24:03,480 --> 00:24:06,440
completed this course go on to
do this course or take the skill
496
00:24:06,440 --> 00:24:09,240
next or whatever.
And just that simple nudge and,
497
00:24:09,280 --> 00:24:12,480
and wording can change the
engagement and how they're kind
498
00:24:12,480 --> 00:24:15,920
of appeal towards it can just
just completely switches.
499
00:24:15,920 --> 00:24:17,960
So I love it.
I think it's great if we get
500
00:24:17,960 --> 00:24:20,080
really small with this.
I think we can have much more
501
00:24:20,640 --> 00:24:23,760
engagement impact, which is the
number one challenge all day,
502
00:24:23,760 --> 00:24:28,040
every day in L&D, everywhere,
everyone you speak to. 100%. So
503
00:24:28,040 --> 00:24:30,480
there's a lot of people in our
audience who are transitioning
504
00:24:30,480 --> 00:24:33,480
into different roles or we have
a lot of transitioning teachers,
505
00:24:33,480 --> 00:24:37,520
L&D professionals.
You've done tons
506
00:24:37,520 --> 00:24:40,280
of stuff.
You're successful in
507
00:24:40,280 --> 00:24:44,520
more ways than I can say
or have words for.
508
00:24:44,880 --> 00:24:48,520
So for people who are listening,
what are like 3 pieces of advice
509
00:24:48,520 --> 00:24:50,680
you could give to them?
If they're maybe looking to be
510
00:24:50,680 --> 00:24:54,320
an entrepreneur, they're maybe
looking to get into L&D,
511
00:24:54,640 --> 00:24:58,320
what should they do?
It's a great question.
512
00:24:58,320 --> 00:25:02,120
I'm going to start with
saying the American attitude is
513
00:25:02,120 --> 00:25:04,640
always a little bit more
embellished with the idea of
514
00:25:05,040 --> 00:25:08,800
shouting about your own kind of
achievements and things.
515
00:25:08,800 --> 00:25:12,040
And I find it difficult enough
to call myself an entrepreneur,
516
00:25:12,040 --> 00:25:14,120
never mind a serial
entrepreneur, for quite a while.
517
00:25:14,920 --> 00:25:17,800
But that's the British way.
We're a bit more reserved
518
00:25:17,800 --> 00:25:19,160
in that respect.
We should shout more.
519
00:25:19,160 --> 00:25:21,240
We should, definitely should.
But yeah, over quite a
520
00:25:21,240 --> 00:25:25,680
while, we have done quite a bit.
I tell you what, my kind
521
00:25:25,680 --> 00:25:28,320
of key things from the founder
side, I think that were the most
522
00:25:28,320 --> 00:25:35,000
surprising. One is probably that the
risk of starting something that
523
00:25:35,000 --> 00:25:38,040
is even like your own business
or even a jump to a different
524
00:25:38,040 --> 00:25:42,120
role is generally a lot lower
than you think it is.
525
00:25:42,200 --> 00:25:45,920
And it feels scarier because
there's a, you know, mortgage
526
00:25:46,520 --> 00:25:48,120
sort of loan, you know, money on
the line.
527
00:25:48,760 --> 00:25:51,800
But a lot of people who are
doing this are generally the
528
00:25:51,800 --> 00:25:55,320
people who are very skilled
already and know the
529
00:25:55,320 --> 00:25:57,720
likeliness of what their
ability is and, you know, where
530
00:25:57,720 --> 00:26:01,360
they can apply it.
And ultimately, if it doesn't
531
00:26:01,360 --> 00:26:04,280
work, in many cases you can, you
can get back into another job
532
00:26:04,280 --> 00:26:06,400
and another role again quite
fast in many cases because then
533
00:26:06,400 --> 00:26:10,440
you're a skilled professional.
So I found that there's
534
00:26:10,440 --> 00:26:12,680
not as many people who want to
take the risk that I take to go
535
00:26:12,680 --> 00:26:13,960
and start the businesses I've
done.
536
00:26:14,240 --> 00:26:18,280
But I think if you kind of frame
it differently, I think it
537
00:26:18,280 --> 00:26:20,320
becomes less of a challenge to
do so. I think you can think,
538
00:26:20,320 --> 00:26:22,520
well, worst case, we can run
this for three months, six
539
00:26:22,520 --> 00:26:24,440
months, and if it doesn't work,
we go get that job again. We
540
00:26:24,440 --> 00:26:26,200
will be able to get a job.
We're pretty confident of that.
541
00:26:26,200 --> 00:26:28,160
We've moved.
We've done so successfully so
542
00:26:28,160 --> 00:26:31,560
far.
So removing that fear I think is
543
00:26:31,560 --> 00:26:34,080
one huge thing there
for people.
544
00:26:35,040 --> 00:26:37,640
Really, really hard, and I get it
because, you know, that's
545
00:26:37,760 --> 00:26:41,680
ultimately why you don't have
everybody having their
546
00:26:41,680 --> 00:26:44,800
own company and doing it.
And it wouldn't work if
547
00:26:44,800 --> 00:26:47,000
everybody did as well because,
you know, there'd just be lots
548
00:26:47,200 --> 00:26:48,800
of companies with one, one
person in them.
549
00:26:49,720 --> 00:26:52,360
But equally, that's why I think
it's a good stepping stone.
550
00:26:52,360 --> 00:26:54,400
And that can apply even if
you're not starting a company
551
00:26:54,400 --> 00:26:56,960
and you're just moving jobs,
whatever, or you start a new
552
00:26:56,960 --> 00:26:59,200
career.
I think it's a similar sort of
553
00:26:59,200 --> 00:27:00,840
thing.
The risk I don't think is as big
554
00:27:00,840 --> 00:27:05,800
as what you perceive it to be.
Secondly, I would just probably
555
00:27:05,800 --> 00:27:10,480
say you have got to probably
genuinely spend the time outside
556
00:27:10,480 --> 00:27:13,800
of your day-to-day, a little bit of
extra time, to explore.
557
00:27:13,880 --> 00:27:16,320
And it can be anything.
It can be take that course on
558
00:27:16,400 --> 00:27:20,920
Udemy or it can be, you know,
just sign up to that event
559
00:27:20,960 --> 00:27:24,680
and go and attend it or it can
be read about a new project.
560
00:27:24,680 --> 00:27:27,840
I mean you could just start some
new software and start trying to
561
00:27:27,840 --> 00:27:30,520
use that.
I think if you just explore it,
562
00:27:30,520 --> 00:27:33,480
you learn a lot more, a lot
quicker about what's possible.
563
00:27:33,480 --> 00:27:36,200
And this feeds back into the AI
conversation.
564
00:27:36,200 --> 00:27:38,520
You're not going to get there
without actually just typing
565
00:27:38,520 --> 00:27:40,560
things in and seeing what it
does in many regards.
566
00:27:41,400 --> 00:27:46,280
And then lastly, I think my
experience has always been, and
567
00:27:46,280 --> 00:27:49,600
this is very kind of founder
focused kind of knowledge, but
568
00:27:50,200 --> 00:27:55,000
just aim at trusting people and
employing people or having, you
569
00:27:55,000 --> 00:27:57,040
know, people in your business
who are better than you.
570
00:27:57,040 --> 00:27:58,800
Don't be afraid of being the
571
00:27:58,800 --> 00:28:01,400
worst one in the room. The guys
and girls that you've got
572
00:28:01,400 --> 00:28:04,600
can be better at it, because
as long as you get their
573
00:28:04,600 --> 00:28:08,680
incentives right and the reasons
why they're doing it aligned with the whole
574
00:28:08,680 --> 00:28:10,080
business and everything you're
doing.
575
00:28:10,080 --> 00:28:12,760
And again, this is just me as a
founder, but it applies in
576
00:28:12,760 --> 00:28:15,400
everything else.
I wouldn't be afraid of the
577
00:28:15,400 --> 00:28:17,840
people who are smarter and I
would trust the people, and
578
00:28:17,920 --> 00:28:21,200
if you get everybody's sort of
direction kind of aligned, you
579
00:28:21,200 --> 00:28:23,280
can achieve so much more
with the right people.
580
00:28:24,120 --> 00:28:25,520
So yeah, those are nice kind of
tidbits,
581
00:28:25,520 --> 00:28:27,360
I think, from the founder
experience.
582
00:28:27,360 --> 00:28:30,240
Yeah, that goes beyond
founder mentality as well. It's
583
00:28:30,240 --> 00:28:32,600
like you're eliminating the
fear, making sure you're
584
00:28:32,600 --> 00:28:35,040
professionally developing or
learning and exploring more,
585
00:28:35,040 --> 00:28:36,720
researching.
That's something that we should
586
00:28:36,720 --> 00:28:39,680
be doing naturally.
And then, you know, hire people
587
00:28:39,680 --> 00:28:42,520
smarter than you.
Yeah, trust them.
588
00:28:42,840 --> 00:28:45,400
Trust them to do the job.
Like, trust them and give them
589
00:28:45,400 --> 00:28:47,840
the tools and the support they
need to do the job.
590
00:28:47,880 --> 00:28:50,760
So absolutely, those are three
very great pieces of advice.
591
00:28:51,280 --> 00:28:52,440
Yeah.
One of my first kind of
592
00:28:52,440 --> 00:28:55,920
realizations of this, and again,
this is going back probably 20
593
00:28:55,920 --> 00:28:59,640
years or so, but it was kind of our
first employee that we had at
594
00:28:59,640 --> 00:29:02,400
the business.
And I was used to being a
595
00:29:02,400 --> 00:29:05,480
freelance developer basically.
So I worked for hire.
596
00:29:05,480 --> 00:29:08,040
I was paid per hour, which
always changes your mentality,
597
00:29:08,040 --> 00:29:09,520
by the way.
I would advise a lot of
598
00:29:09,520 --> 00:29:12,240
people, if they could, to work per
hour for the money they work for.
599
00:29:12,240 --> 00:29:15,000
It changes how you perceive your
working time and you, you
600
00:29:15,000 --> 00:29:17,720
deliver things differently.
It's very strange, but I went
601
00:29:17,720 --> 00:29:23,120
away on vacation and, and we
were still developing pieces and
602
00:29:23,240 --> 00:29:25,920
shipping them to our customers
and raising invoices and
603
00:29:25,920 --> 00:29:28,160
receiving money in.
And I thought, hang on, this is
604
00:29:28,640 --> 00:29:30,080
fantastic.
I'd only been used to knowing
605
00:29:30,080 --> 00:29:32,840
what I knew and what I could do.
And then suddenly you trust the
606
00:29:32,840 --> 00:29:35,400
people and you can go away and
say, well, I think we'll deliver
607
00:29:35,400 --> 00:29:37,680
that.
And you suddenly realise there's
608
00:29:37,680 --> 00:29:41,840
a lot more scope to do a lot
more with, you know, by just
609
00:29:41,880 --> 00:29:43,400
kind of getting the right
people in place.
610
00:29:43,400 --> 00:29:45,640
So that was one of my first
realisations of it.
611
00:29:45,640 --> 00:29:48,280
And that helped me kind of
perceive and take the risk
612
00:29:48,280 --> 00:29:50,680
because it evidently is a risk
realistically, but especially so
613
00:29:50,680 --> 00:29:52,920
when you're hiring, in
business, there's a lot of
614
00:29:52,920 --> 00:29:55,920
people's lives on the line that
you will look after and try and
615
00:29:55,920 --> 00:29:58,920
pay for.
So yeah, I think that's always
616
00:29:58,920 --> 00:30:01,400
pushed me forward though to take
the risk because you kind of
617
00:30:01,400 --> 00:30:04,160
know if you get the right people
and kind of incentivize
618
00:30:04,160 --> 00:30:06,920
them right, then you can get some
great work and kind of
619
00:30:06,920 --> 00:30:08,720
like, enjoy it.
A lot, as a founder, you can kind
620
00:30:08,720 --> 00:30:12,080
of see things in the future if
you put the right pieces into
621
00:30:12,080 --> 00:30:14,720
place.
Yeah, I think so.
622
00:30:14,720 --> 00:30:17,880
There's a little bit about, I
think everything is a risk, sort
623
00:30:17,880 --> 00:30:19,680
of every decision you make, I
think.
624
00:30:19,760 --> 00:30:22,760
Nothing's a guarantee.
Nothing's a guarantee, but I, I
625
00:30:22,760 --> 00:30:25,160
think if it's like that building
block approach, if you kind of
626
00:30:25,160 --> 00:30:27,360
come at it with, well, that kind
of makes sense and we'll do that
627
00:30:27,360 --> 00:30:29,440
and that makes sense.
You'll look back in a few years
628
00:30:29,440 --> 00:30:32,200
and sort of everything will have
been based upon not one
629
00:30:32,200 --> 00:30:33,560
decision.
It'll be based upon the
630
00:30:33,560 --> 00:30:35,440
combination of the, the hundreds
you've made.
631
00:30:36,080 --> 00:30:38,800
So you can't really get too
concerned if one's wrong, as
632
00:30:38,800 --> 00:30:42,320
long as you're doing kind of the
right things generally.
633
00:30:43,600 --> 00:30:46,520
Absolutely, Fred.
We had so much fun.
634
00:30:46,520 --> 00:30:47,960
We're at the end of the episode
now.
635
00:30:48,080 --> 00:30:51,000
So fast.
Yeah, it does go
636
00:30:51,000 --> 00:30:52,360
really fast when you're having
fun.
637
00:30:52,720 --> 00:30:54,480
So tell people where they can
find you.
638
00:30:54,480 --> 00:30:56,880
We're obviously going to include
everything in the show notes
639
00:30:57,000 --> 00:30:59,120
about Thirst, about you, where
to connect.
640
00:30:59,120 --> 00:31:00,280
So tell us where we can find
you.
641
00:31:00,280 --> 00:31:03,200
What's the best way for people
to connect with you or learn
642
00:31:03,200 --> 00:31:05,640
more about Thirst?
Yeah, no problem.
643
00:31:05,640 --> 00:31:07,520
Generally, LinkedIn is always
the place where we point
644
00:31:07,520 --> 00:31:09,720
somebody to now, isn't it?
So I'm on LinkedIn.
645
00:31:10,040 --> 00:31:13,520
Fred Thompson. You should be able
to find a relatively simple URL,
646
00:31:13,520 --> 00:31:16,520
but we'll post it in the show
notes, and then Thirst is
647
00:31:16,520 --> 00:31:19,160
thirst.io as well if you want to
find out more about the
648
00:31:19,160 --> 00:31:21,920
platform that we produce.
Fantastic.
649
00:31:21,920 --> 00:31:24,000
Well, thank you so much for
coming on the show.
650
00:31:24,760 --> 00:31:28,320
Really appreciated hearing your
perspective and I love the
651
00:31:28,600 --> 00:31:31,960
reference you made to the
American culture about how we
652
00:31:31,960 --> 00:31:35,360
boast about our...
I'm one of the more reserved
653
00:31:35,360 --> 00:31:36,400
ones.
I don't boast about it.
654
00:31:36,400 --> 00:31:38,720
I feel very like weird when
people say, like, you're
655
00:31:38,720 --> 00:31:40,400
an
influencer, you're this, you're
656
00:31:40,480 --> 00:31:41,760
that.
And I'm like, no, I'm not.
657
00:31:41,800 --> 00:31:43,960
I'm just a person.
I love it.
658
00:31:43,960 --> 00:31:46,600
I just wish kind of, you know,
as a, as a nationality, as a
659
00:31:46,600 --> 00:31:48,160
nation, we would embrace it a
little bit more.
660
00:31:48,160 --> 00:31:48,720
We're not too
good at it.
661
00:31:48,720 --> 00:31:49,960
Yeah, absolutely.
Yeah.
662
00:31:50,920 --> 00:31:52,640
Well, we'll see what happens in
the future.
663
00:31:52,640 --> 00:31:56,040
So thank you so much.
And I can't wait for people to
664
00:31:56,040 --> 00:32:00,720
hear this episode.
Thanks for spending a few
665
00:32:00,720 --> 00:32:03,040
minutes with Holly.
She knows your podcast queue is
666
00:32:03,040 --> 00:32:05,720
packed.
If today's episode sparked an
667
00:32:05,720 --> 00:32:09,920
idea or gave you that extra
nudge of confidence, tap, follow
668
00:32:10,160 --> 00:32:13,600
or subscribe in your favorite
app so you never miss an episode
669
00:32:13,600 --> 00:32:16,880
of EdUp L&D.
Dropping a quick rating or
670
00:32:16,880 --> 00:32:20,360
review helps more educators and
learning pros discover the show,
671
00:32:20,360 --> 00:32:22,320
too.
Want to keep the conversation
672
00:32:22,320 --> 00:32:24,640
going?
Connect with Holly on LinkedIn
673
00:32:24,720 --> 00:32:26,200
and share your biggest
takeaway.
674
00:32:26,480 --> 00:32:29,760
She reads every message.
Until next time, keep learning,
675
00:32:29,840 --> 00:32:32,400
keep leading, and keep believing
in your own story.
676
00:32:32,920 --> 00:32:36,560
Talk soon.
Hi, we're iSpring, an
677
00:32:36,560 --> 00:32:39,480
international team of e-learning
enthusiasts who help more than
678
00:32:39,480 --> 00:32:43,000
60,000 clients across the globe
succeed with better online
679
00:32:43,000 --> 00:32:45,480
learning.
Our two flagship solutions are
680
00:32:45,480 --> 00:32:48,160
iSpring Suite and iSpring
Learn LMS.
681
00:32:48,560 --> 00:33:00,920
iSpring Suite is an intuitive,
all-in-one... We'd be happy to get to
682
00:33:00,920 --> 00:33:03,240
know you and pick a solution
that fits your needs best.
683
00:33:03,560 --> 00:33:07,720
Go to www.ispringsolutions.com
to learn more about us and
684
00:33:07,720 --> 00:33:08,320
connect.

Fred Thompson
Founder, Thirst
Fred Thompson is CEO of Thirst and is a seasoned learning and development expert with over two decades of experience helping businesses create engaging learning environments that level up learner engagement.
Fred has spearheaded numerous initiatives that have empowered employees to learn and develop in the most intuitive way possible. His insights into the latest learning technologies and trends have been invaluable to countless organisations looking to turbocharge their learning cultures with game-changing learning platforms.
Fred has a passion for using AI and machine learning to create social learning experiences that meet the unique needs of each employee and help them enhance their skills and grow their careers.