166: The Future Isn’t Flashier—It’s Smarter: Christy Tucker on Building Learning That Sticks

In this episode of EdUp L&D, Holly Owens welcomes back Christy Tucker, a seasoned expert in instructional design and scenario-based learning. They discuss Christy's journey from K-12 education to corporate training, the nuances of scenario-based learning, common pitfalls in instructional design, and the evolving role of AI in the field. Christy shares valuable insights on how to effectively implement branching scenarios and the importance of aligning practice with real-world skills. The conversation wraps up with Christy's thoughts on the future of instructional design and where to find her resources.


Resources mentioned in this episode:


Guest Contact Information:

___________________________________

Episode Sponsor: iSpring Solutions

🎙️ Huge thanks to our friends at iSpring Solutions for sponsoring this episode of the EdUp L&D podcast! 🙌

If you haven’t already, be sure to check out the iSpring Learning Exchange Community — a vibrant space for creators, educators, and L&D pros to connect and grow.

And don’t miss your chance to join the iSpring Course Creation Contest by July 25! 🚀

Grateful for the support and excited to see what our community creates 💡



Thanks for tuning in! 🎧

 

1
00:00:01,440 --> 00:00:05,000
Hey friends, and welcome back to
another incredible episode of

2
00:00:05,040 --> 00:00:08,400
EdUp L&D.
I'm your host, Holly Owens, and

3
00:00:08,400 --> 00:00:13,160
today I'm beyond excited because
we have a returning guest who is

4
00:00:13,160 --> 00:00:15,600
basically a legend in the L&D
world.

5
00:00:15,800 --> 00:00:17,760
Christy Tucker is back on the
mic.

6
00:00:18,480 --> 00:00:21,920
If you're not already following
her on LinkedIn or reading her

7
00:00:21,920 --> 00:00:26,840
blog, what are you doing?
Christy brings deep insight,

8
00:00:27,000 --> 00:00:30,480
clarity, and a whole lot of real
talk when it comes to

9
00:00:30,480 --> 00:00:34,360
instructional design,
freelancing, and all things

10
00:00:34,360 --> 00:00:37,880
scenario based learning.
In this episode, we nerd out

11
00:00:37,880 --> 00:00:42,040
hard on what makes good
learning stick,

12
00:00:42,600 --> 00:00:47,480
especially branching scenarios,
simulations, and how to actually

13
00:00:47,480 --> 00:00:50,080
apply these tools in the real
world.

14
00:00:50,840 --> 00:00:54,800
Plus, we dive into how AI has
evolved in our workflows over

15
00:00:54,800 --> 00:00:59,240
the last year, the ethical
nuances around image generation,

16
00:00:59,520 --> 00:01:03,680
and where this technology might
take our field next.

17
00:01:04,120 --> 00:01:07,920
So buckle up, this is one of
those jam packed conversations

18
00:01:08,120 --> 00:01:10,120
that's going to get your wheels
turning.

19
00:01:10,680 --> 00:01:12,920
Let's dive in with Christy
Tucker.

20
00:01:14,800 --> 00:01:18,440
Hi, we're iSpring, an
international team of e-learning

21
00:01:18,440 --> 00:01:22,600
enthusiasts who help more than
60,000 clients across the globe

22
00:01:22,600 --> 00:01:25,320
succeed with better online
learning.

23
00:01:26,040 --> 00:01:30,240
Our two flagship solutions are
iSpring Suite and iSpring Learn

24
00:01:30,280 --> 00:01:33,680
LMS.
iSpring Suite is an intuitive,

25
00:01:33,760 --> 00:01:37,240
all in one authoring tool for
creating engaging e-learning

26
00:01:37,240 --> 00:01:41,160
content, and iSpring Learn is an
innovative online training

27
00:01:41,160 --> 00:01:44,880
platform for onboarding,
upskilling, and certifying your

28
00:01:44,880 --> 00:01:47,560
teams.
We also provide tons of free

29
00:01:47,560 --> 00:01:50,400
resources for aspiring and
experienced e-learning

30
00:01:50,400 --> 00:01:53,720
professionals, conduct weekly
webinars with top industry

31
00:01:53,720 --> 00:01:57,440
experts, and organize annual
e-learning conferences,

32
00:01:57,520 --> 00:02:02,000
challenges, and championships.
We'd be happy to get to know you

33
00:02:02,120 --> 00:02:04,760
and pick a solution that fits
your needs best.

34
00:02:05,160 --> 00:02:10,759
Go to www.ispringsolutions.com
to learn more about us, download

35
00:02:10,759 --> 00:02:16,840
our resources, and connect.
Hello everyone, and welcome to

36
00:02:16,840 --> 00:02:19,960
another amazing episode of
EdUp L&D.

37
00:02:20,240 --> 00:02:23,880
I'm super pumped today because
we have a returning guest and

38
00:02:23,880 --> 00:02:25,960
you're going to love her.
If you don't already love her,

39
00:02:25,960 --> 00:02:29,880
follow her around LinkedIn and all
of her great resources and tips.

40
00:02:30,200 --> 00:02:33,480
Christy Tucker is here.
Welcome to the show, Christy.

41
00:02:33,480 --> 00:02:35,080
Welcome back to the show,
Christy.

42
00:02:35,760 --> 00:02:37,920
It's so good to have a chance to
chat with you again, Holly.

43
00:02:37,920 --> 00:02:40,960
We had so much fun last time.
We did and I'm, I'm looking

44
00:02:40,960 --> 00:02:43,600
forward to asking that question
we were talking about earlier

45
00:02:43,600 --> 00:02:46,720
cuz it's, we kind of had, it was
a timely episode.

46
00:02:46,720 --> 00:02:50,800
The last episode it was.
But but before we dive into all

47
00:02:50,800 --> 00:02:54,280
that, for those who don't know
you, and they should know you

48
00:02:54,920 --> 00:02:57,920
out on the LinkedIn space
especially, tell us about your

49
00:02:57,920 --> 00:03:01,120
journey into L&D, what you do
like, what's your area of

50
00:03:01,120 --> 00:03:04,640
expertise, all the different
things you've been up to and how

51
00:03:04,640 --> 00:03:07,800
you've grown into this area that
you're in now.

52
00:03:08,360 --> 00:03:12,920
Yeah, so my career has always
been about helping people learn

53
00:03:12,920 --> 00:03:15,320
in one way or another.
So I've been helping people

54
00:03:15,320 --> 00:03:20,280
learn for over 20 years.
I started as a K-12 music and band

55
00:03:20,280 --> 00:03:23,000
teacher.
Like most of us, not a band

56
00:03:23,040 --> 00:03:25,640
teacher. Many of us started as
a K-12 teacher.

57
00:03:25,640 --> 00:03:27,160
That's where I started,
too.

58
00:03:28,200 --> 00:03:33,920
And then I switched to corporate
software training back when lots

59
00:03:33,920 --> 00:03:38,840
of businesses had computer labs
where you would go in person and

60
00:03:38,840 --> 00:03:42,440
sit in a computer room and have
a person stand up in the front

61
00:03:42,440 --> 00:03:44,440
of the room and teach you how to
use Microsoft Office.

62
00:03:44,560 --> 00:03:47,680
Yeah, I did that job.
It is why I know all sorts of

63
00:03:47,680 --> 00:03:50,800
obscure, you know, Microsoft
Office, Microsoft Excel

64
00:03:51,680 --> 00:03:53,440
shortcuts and things.
We're gonna have to get

65
00:03:53,440 --> 00:03:56,080
like a random fact in the
episode about Microsoft

66
00:03:56,080 --> 00:03:58,520
something.
I know we, we, we did.

67
00:03:58,520 --> 00:04:02,520
So I did all of that, like
train

68
00:04:02,520 --> 00:04:06,000
people how to use Office and
Microsoft Project and Access and

69
00:04:06,000 --> 00:04:09,160
a little bit of relational
database design.

70
00:04:09,200 --> 00:04:12,960
And that was fun.
But I missed creating the

71
00:04:12,960 --> 00:04:14,920
curriculum side of things from
teaching.

72
00:04:15,360 --> 00:04:17,360
I'd been like teaching out of
books that other people were

73
00:04:17,360 --> 00:04:21,079
writing up, and that was fine,
and I liked working with

74
00:04:21,079 --> 00:04:23,920
adult learners, but I missed the
writing side of things.

75
00:04:23,920 --> 00:04:26,200
And so I did the research of
like, well, what's, what's a job

76
00:04:26,200 --> 00:04:29,920
where I do that and found
instructional design and then

77
00:04:31,360 --> 00:04:34,040
spent a long time applying for
jobs before I got the first

78
00:04:34,040 --> 00:04:38,720
instructional design job.
But I have now been working in

79
00:04:38,720 --> 00:04:42,760
instructional design for over 20
years and so.

80
00:04:43,560 --> 00:04:46,600
It goes by fast, doesn't it?
Yeah, I've done.

81
00:04:46,600 --> 00:04:51,800
I started out working for an
online university initially, but

82
00:04:51,840 --> 00:04:57,000
I've done kind of both higher Ed
and workplace training things.

83
00:04:57,000 --> 00:05:01,320
I've worked with nonprofits and
government agencies, everything

84
00:05:01,320 --> 00:05:04,440
from cities up to federal
government agencies.

85
00:05:04,440 --> 00:05:07,400
I've done lots of things with
associations and professional

86
00:05:07,400 --> 00:05:10,160
development for people through
associations.

87
00:05:10,160 --> 00:05:13,000
And I started my own business in
2011.

88
00:05:13,000 --> 00:05:17,160
So I've been now working for
myself for more than 10 years.

89
00:05:17,600 --> 00:05:20,800
I am a terrible yes man.
And so I'm not actually very

90
00:05:20,880 --> 00:05:23,600
good at working for a boss.
It's much better when I'm the

91
00:05:23,600 --> 00:05:26,320
external consultant and then I
get to tell people like, yeah,

92
00:05:26,320 --> 00:05:30,040
this is a dumb way to do things.
I mean, I'm usually better,

93
00:05:30,040 --> 00:05:33,720
more diplomatic.
But we have

94
00:05:33,720 --> 00:05:35,720
a much better way that you can.
Right, right.

95
00:05:35,960 --> 00:05:39,560
It's always on the, like, well, you

96
00:05:39,560 --> 00:05:41,600
know, I can see where you're
going with this.

97
00:05:41,600 --> 00:05:44,520
But given the goal that you've
said you had,

98
00:05:44,520 --> 00:05:46,920
would you be open to
considering another approach?

99
00:05:46,960 --> 00:05:48,960
Yeah, So like people.
There you go.

100
00:05:49,200 --> 00:05:51,120
Like the way that you did it is
dumb.

101
00:05:51,760 --> 00:05:54,480
Yeah.
But sometimes it is like

102
00:05:54,480 --> 00:05:57,560
you have to be pretty blunt,
like this is not going to work.

103
00:05:57,720 --> 00:06:00,240
It is really not going to work.
And also sometimes it is

104
00:06:00,240 --> 00:06:04,800
the I've made my recommendation,
but ultimately you're the

105
00:06:04,800 --> 00:06:09,000
customer and if you choose to go
ahead with this, like understand

106
00:06:09,400 --> 00:06:14,240
that you know you're taking
responsibility for the outcomes.

107
00:06:14,400 --> 00:06:16,720
You're responsible for the

108
00:06:16,720 --> 00:06:20,280
consequences of what happens.
And sometimes you

109
00:06:20,280 --> 00:06:22,960
let clients have the natural
consequences of their decisions.

110
00:06:23,160 --> 00:06:25,160
Yeah.
And sometimes that also is your

111
00:06:25,160 --> 00:06:28,560
job as a consultant.
So I've been doing that these

112
00:06:28,560 --> 00:06:32,200
days, I specialize particularly
in scenario based learning and

113
00:06:32,200 --> 00:06:35,320
branching scenarios.
I've been blogging.

114
00:06:35,320 --> 00:06:37,800
I'll hit, let's see.
I'm, I'm, I'm not, I'm not at

115
00:06:37,800 --> 00:06:42,240
the 20 years on blogging, but
it's been a long time and I do

116
00:06:43,080 --> 00:06:45,720
right in my blog about scenario
based learning and a lot about

117
00:06:45,880 --> 00:06:48,920
AI images and, and using AI
these days too.

118
00:06:48,960 --> 00:06:52,840
So those are sort of the things
that I'm known for.

119
00:06:53,000 --> 00:06:55,080
Yeah.
So you know, for the audience

120
00:06:55,080 --> 00:06:57,520
that doesn't know, you're
definitely an expert in the

121
00:06:57,520 --> 00:06:59,320
scenario based learning as
you're saying.

122
00:06:59,600 --> 00:07:01,360
So tell them what that actually
is.

123
00:07:01,360 --> 00:07:04,640
Now, we definitely
have a majority of our audience:

124
00:07:04,640 --> 00:07:06,320
transitioning teachers like
ourselves.

125
00:07:06,320 --> 00:07:09,360
So they might not understand.
You know, the IDs that are

126
00:07:09,360 --> 00:07:11,080
listening, they definitely have
an idea.

127
00:07:11,080 --> 00:07:12,640
But what is scenario based
learning?

128
00:07:13,520 --> 00:07:16,000
And you know, I think I actually
wouldn't necessarily assume that

129
00:07:16,000 --> 00:07:17,640
everybody who's an instructional
designer does.

130
00:07:17,640 --> 00:07:20,720
Yeah, that's true.
Because we are so terrible in

131
00:07:20,720 --> 00:07:23,440
this field about having
consistent terminology, scenario

132
00:07:23,440 --> 00:07:27,640
based learning is not maybe
quite as bad as like micro

133
00:07:27,640 --> 00:07:30,920
learning. David Kelly
once said that there are no

134
00:07:30,920 --> 00:07:33,800
definitions of micro learning,
there are just opinions, some of

135
00:07:33,800 --> 00:07:35,520
which are labeled as
definitions.

136
00:07:35,720 --> 00:07:38,160
Is that the same for scenario-based?
And scenario-based

137
00:07:38,160 --> 00:07:41,400
learning has some of
that too. I will say I

138
00:07:41,400 --> 00:07:44,560
personally tend to be
pretty broad in my definition of

139
00:07:44,560 --> 00:07:47,720
scenario based learning that I
do tend to think that any sort

140
00:07:47,720 --> 00:07:51,520
of learning that is using
scenarios as part of the

141
00:07:51,920 --> 00:07:55,680
training, as part of the
learning experience can go under

142
00:07:55,680 --> 00:07:58,080
that broad umbrella of scenario
based learning.

143
00:07:58,720 --> 00:08:02,120
Branching scenarios are one form
of that, and they tend to be the

144
00:08:02,120 --> 00:08:04,720
one that people think of, but I
would actually put that as a

145
00:08:04,720 --> 00:08:07,920
smaller subset within a broader umbrella.
That's the one I think

146
00:08:08,880 --> 00:08:13,800
of.
Right, so branching scenarios are essentially like the choose

147
00:08:13,800 --> 00:08:18,920
your own adventure story for
training. So it is that you have

148
00:08:18,920 --> 00:08:23,080
a situation, you have a couple of
choices, you make a choice, and

149
00:08:23,080 --> 00:08:26,600
then what happens next is
different depending on what your

150
00:08:26,600 --> 00:08:31,400
choice is. And so it branches out
into multiple different paths,

151
00:08:31,400 --> 00:08:35,039
and there are multiple different
endings, positive and negative,

152
00:08:35,039 --> 00:08:38,640
and you get different results.
You see the consequences of your

153
00:08:38,640 --> 00:08:42,200
decisions and I love.
Before you're actually doing it

154
00:08:42,200 --> 00:08:43,799
in real life.
Right before.

155
00:08:43,799 --> 00:08:46,160
You're which is important.
Right, right.

156
00:08:46,160 --> 00:08:49,600
And so it's a key thing for when
you've got things like the soft

157
00:08:49,600 --> 00:08:52,000
skills, when you've got
communication skills, when

158
00:08:52,000 --> 00:08:54,880
you've got strategic skills,
when you've got things where

159
00:08:54,880 --> 00:08:59,360
there's gray area and where
you're working on practicing

160
00:08:59,360 --> 00:09:03,080
decision making.
Branching scenarios are one of

161
00:09:03,080 --> 00:09:07,160
the most effective ways to train
that.

162
00:09:07,920 --> 00:09:11,640
One thing that we fail a lot in,
in workplace training in

163
00:09:11,640 --> 00:09:15,200
particular, and also in
education, is giving people

164
00:09:15,200 --> 00:09:19,600
enough practice with feedback,
doing the actual thing.

165
00:09:20,320 --> 00:09:24,680
And the closer you can get the
practice to look like the real

166
00:09:24,680 --> 00:09:27,200
skill the people need to do, the
better off it is.

167
00:09:27,920 --> 00:09:30,680
And so even though in a
branching scenario you might

168
00:09:30,680 --> 00:09:34,440
have just text or maybe you have
text and static images, you can

169
00:09:34,440 --> 00:09:38,840
still make it cognitively more
similar to the kinds of skills

170
00:09:38,840 --> 00:09:41,800
that people need to practice.
You can have them practice

171
00:09:41,800 --> 00:09:46,240
decision making, which is key
for a whole lot of different

172
00:09:46,240 --> 00:09:48,120
things.
So you know, if it's something

173
00:09:48,120 --> 00:09:51,600
procedural where you just follow
the same steps every time and

174
00:09:51,600 --> 00:09:53,480
it's exactly the same every time
you do it.

175
00:09:53,480 --> 00:09:55,800
And if you give five different
people the thing, you're going

176
00:09:55,800 --> 00:09:58,640
to get the same results all five
times as long as they follow the

177
00:09:58,640 --> 00:10:00,560
procedure.
Don't use a branching scenario

178
00:10:00,560 --> 00:10:01,600
for that.
It's overkill.

179
00:10:01,600 --> 00:10:02,640
It's dumb, you'll waste your
time.

180
00:10:04,720 --> 00:10:07,960
But well said.
Right like now, maybe you need a

181
00:10:07,960 --> 00:10:09,920
little.
Maybe you do need some sort of

182
00:10:09,920 --> 00:10:11,960
simulation of the process, like
there's.

183
00:10:11,960 --> 00:10:14,560
Yeah, I was gonna ask cuz that's
one of the things we use a lot

184
00:10:14,560 --> 00:10:20,280
at Amazon is simulations in
privacy, a simulation that

185
00:10:20,280 --> 00:10:22,640
walks you through.
But if it is, if it is just if

186
00:10:22,640 --> 00:10:25,400
it is something where there's a
checklist and you really are

187
00:10:25,400 --> 00:10:26,800
doing it the same way every
time.

188
00:10:27,160 --> 00:10:29,480
This is the difference between
procedural skills and strategic

189
00:10:29,480 --> 00:10:31,680
skills.
Ruth Clark talks about this in

190
00:10:31,680 --> 00:10:35,680
her book on scenario based
e-learning where she talks about

191
00:10:35,680 --> 00:10:39,120
like when do you use scenarios
and when should you use...

192
00:10:39,120 --> 00:10:43,000
when is some other approach
probably better? Procedural is:

193
00:10:43,320 --> 00:10:45,880
you give five people the task
and, as long as

194
00:10:45,880 --> 00:10:47,920
everybody does it right, you get
the same result five times.

195
00:10:49,560 --> 00:10:53,000
Strategic tasks are the ones
where if you give five people

196
00:10:53,000 --> 00:10:56,480
the brief and you're going to
get 5 different results, all of

197
00:10:56,480 --> 00:11:00,880
which may be successful,
although in different ways.

198
00:11:00,880 --> 00:11:05,400
So if it is building a website
for your business, you give that

199
00:11:05,400 --> 00:11:07,800
brief to five different web
developers, you are going to get

200
00:11:08,160 --> 00:11:13,600
very different results and they
all could be successful

201
00:11:15,120 --> 00:11:17,600
depending on, but they're
probably going to, you know,

202
00:11:17,600 --> 00:11:20,320
weigh certain factors higher or
lower.

203
00:11:20,320 --> 00:11:23,960
Visual design is not, there's
not one way to solve that.

204
00:11:24,960 --> 00:11:28,440
Writing things, writing,
copywriting, learning, there's

205
00:11:28,440 --> 00:11:33,240
not one right answer that is
automatically better than

206
00:11:33,240 --> 00:11:37,400
everything else.
Those are strategic skills and

207
00:11:37,400 --> 00:11:40,680
those in particular are really
hard to practice in a lot of the

208
00:11:40,680 --> 00:11:45,080
traditional ways that we do
assessment and practice where we

209
00:11:45,080 --> 00:11:49,120
have forced choice, branching
scenarios do still have a forced

210
00:11:49,120 --> 00:11:52,880
choice in the typical way that
we're doing things, but it is a

211
00:11:52,880 --> 00:11:55,120
forced choice within a realistic
context.

212
00:11:55,360 --> 00:11:58,440
And you're choosing to do
something not just to talk

213
00:11:58,440 --> 00:12:01,520
about, you know, categorizing.
Well, what type of question

214
00:12:02,080 --> 00:12:04,280
would you do?
It's the difference between

215
00:12:04,280 --> 00:12:06,320
you're a manager and you're
trying to resolve a conflict

216
00:12:06,320 --> 00:12:11,400
between two of your employees
asking the question, what type

217
00:12:11,400 --> 00:12:15,320
of de-escalation strategy should
you use in order to make sure

218
00:12:15,320 --> 00:12:18,880
that both participants, both
employees feel heard, right?

219
00:12:19,200 --> 00:12:22,520
That's a categorization question
and it's abstracted, but it's

220
00:12:22,520 --> 00:12:26,800
not the same as deciding, you
know, Rita and Oliver are

221
00:12:26,800 --> 00:12:29,400
arguing.
Here's what they said.

222
00:12:29,800 --> 00:12:33,520
What do you do next?
If you immediately take one

223
00:12:33,520 --> 00:12:37,800
employee's side and say, OK,
Oliver, I think you've got a

224
00:12:37,800 --> 00:12:39,840
good point here, Rita, I think
we should just go with Oliver's

225
00:12:39,840 --> 00:12:41,600
plan.
And you cut that off without

226
00:12:41,600 --> 00:12:44,840
listening, you're going to get a
different result than if you

227
00:12:44,840 --> 00:12:49,000
say, OK, let's sit down and have
a conversation about this.

228
00:12:49,000 --> 00:12:52,880
Or if you go talk to each one of
them privately first and then

229
00:12:52,880 --> 00:12:55,080
you try to get them together,
maybe you get some other

230
00:12:55,080 --> 00:12:57,920
background information so you
come in more prepared for that

231
00:12:57,920 --> 00:12:59,360
conversation with the two of
them.
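
As a rough illustration of the branching structure being described here, the manager conflict example maps onto a small graph of decision points: each node is a situation, each choice points to a different next node, and some nodes are endings that show better or worse consequences. The Python below is a minimal, hypothetical sketch; the node names, choices, and endings are invented for illustration and are not taken from the episode or from any specific authoring tool.

# Minimal sketch of a branching scenario as a graph of decision points.
# All node names, text, and choices are invented for illustration.
SCENARIO = {
    "start": {
        "text": "Rita and Oliver are arguing about the project plan. What do you do?",
        "choices": {
            "Immediately side with Oliver": "ending_dismissed",
            "Sit everyone down to talk it through": "joint_meeting",
            "Talk to each person privately first": "private_chats",
        },
    },
    "joint_meeting": {
        "text": "Both explain their views, but tempers are still high. What next?",
        "choices": {
            "Summarize both positions and look for common ground": "ending_resolved",
            "Tell them to just pick one plan and move on": "ending_dismissed",
        },
    },
    "private_chats": {
        "text": "You gather background from each side and reconvene better prepared.",
        "choices": {"Facilitate a joint conversation": "ending_resolved"},
    },
    "ending_resolved": {
        "text": "Both employees feel heard and the conflict de-escalates.",
        "choices": {},
    },
    "ending_dismissed": {
        "text": "One employee feels cut off and the conflict resurfaces later.",
        "choices": {},
    },
}

def play(scenario, node="start"):
    """Walk the scenario in the console, one decision at a time."""
    while True:
        step = scenario[node]
        print("\n" + step["text"])
        if not step["choices"]:
            break  # reached an ending
        options = list(step["choices"].items())
        for number, (label, _target) in enumerate(options, start=1):
            print(f"  {number}. {label}")
        pick = int(input("Choose a number: ")) - 1
        node = options[pick][1]

if __name__ == "__main__":
    play(SCENARIO)

Authoring tools generally build this same structure visually; the underlying idea is simply that each choice leads to a different next passage, and the endings show the consequences of the path taken.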

232
00:12:59,840 --> 00:13:02,360
For sure.
Yeah, that's, that's common.

233
00:13:02,360 --> 00:13:06,000
That happens across industries.
Those sorts of conversations are

234
00:13:06,000 --> 00:13:07,840
the sorts of conflicts that you
have to resolve.

235
00:13:07,840 --> 00:13:11,160
It's not just strictly related
to like anything that we're in.

236
00:13:11,200 --> 00:13:13,400
You're kind of, you're already
mentioning this, but you know,

237
00:13:13,400 --> 00:13:16,280
what are some of the most common
mistakes instructional designers

238
00:13:16,280 --> 00:13:18,600
make when they create these
branching scenarios you're

239
00:13:18,600 --> 00:13:20,160
talking about?
Like they don't like if it's

240
00:13:20,160 --> 00:13:23,320
just one workflow, you don't
need a branching scenario or a

241
00:13:23,320 --> 00:13:28,040
simulation necessarily for that.
And like, how can, how can we

242
00:13:28,040 --> 00:13:30,200
fix them?
So if they've, I know one of the

243
00:13:30,200 --> 00:13:33,280
things when I worked at
pharmacy, we had so many e-

244
00:13:33,280 --> 00:13:36,640
learnings, from
onboarding till, you know,

245
00:13:36,640 --> 00:13:39,720
retraining them, but we had to
go back and redo everything.

246
00:13:39,720 --> 00:13:43,640
So how when they get to this
point where they're taking a

247
00:13:43,640 --> 00:13:46,320
different approach, maybe
they're updating, How do you,

248
00:13:46,480 --> 00:13:49,360
how can they fix that stuff?
What can they do?

249
00:13:50,240 --> 00:13:53,040
So I think there's there's two
things in terms of like getting

250
00:13:53,040 --> 00:13:54,560
started.
One option is, you know, I

251
00:13:54,560 --> 00:13:57,800
talked about how I do tend to
take the broader view on

252
00:13:57,800 --> 00:14:01,160
scenario based learning.
Besides the branching scenarios.

253
00:14:01,280 --> 00:14:06,480
I also use a lot of one question
mini scenarios, set it up, ask

254
00:14:06,480 --> 00:14:11,200
one question and then give some
feedback because even if I

255
00:14:11,200 --> 00:14:14,640
cannot convince an organization
that they want to invest in a

256
00:14:14,640 --> 00:14:19,760
branching scenario, the one
question scenario does not take

257
00:14:19,760 --> 00:14:23,160
too much longer to write than a
traditional multiple choice

258
00:14:23,160 --> 00:14:26,000
question.
But it's more application, it's

259
00:14:26,000 --> 00:14:29,320
higher level thinking, it's a
better practice activity.

260
00:14:29,880 --> 00:14:33,480
It's a really good way to get
started and to start building

261
00:14:33,480 --> 00:14:37,520
your skills writing these little
scenarios and writing tight and

262
00:14:38,120 --> 00:14:44,640
writing choices that are actions
rather than the abstract or

263
00:14:44,640 --> 00:14:48,000
categorization kinds of
questions that we tend to do.

264
00:14:48,560 --> 00:14:51,800
So the recall stuff, yeah.
Yeah, the recall because so much

265
00:14:51,800 --> 00:14:55,000
of the time, right, we, we do an
e-learning, we give them content

266
00:14:55,000 --> 00:14:58,160
and then we say so, you know,
can you remember the thing we

267
00:14:58,160 --> 00:15:00,120
told you 5 minutes ago?
Like the definition of

268
00:15:00,120 --> 00:15:02,880
something.
Right, the death, yeah, which is

269
00:15:02,880 --> 00:15:07,560
the definition of these things.
Which order do these steps go

270
00:15:07,560 --> 00:15:10,120
in?
OK, like that is important, but

271
00:15:10,120 --> 00:15:15,240
also is it a more effective
practice to have them recognize

272
00:15:15,240 --> 00:15:18,760
the order in a multiple choice
question where you've just

273
00:15:18,920 --> 00:15:22,000
shuffled the things around for
three or four choices or is it a

274
00:15:22,000 --> 00:15:23,480
more?
And then when they retake it,

275
00:15:23,480 --> 00:15:25,560
it's reshuffled again.
Right, then it's reshuffled.

276
00:15:25,720 --> 00:15:31,120
Or is it more effective to give
them a scenario in which case

277
00:15:31,480 --> 00:15:33,240
they have to do the steps in
order?

278
00:15:33,760 --> 00:15:39,120
And I love as a plausible
distractor for scenarios to have

279
00:15:39,360 --> 00:15:44,080
the right decision at the wrong
time in the process because

280
00:15:44,080 --> 00:15:46,640
that's a really easy mistake.
Anytime you have something where

281
00:15:46,640 --> 00:15:49,840
there's a set of steps, it's
the, Oh yeah, I need to use, you

282
00:15:49,840 --> 00:15:53,280
know, thinking about
motivational interviewing, which

283
00:15:53,280 --> 00:15:55,880
is a technique in, in healthcare
and other things.

284
00:15:55,880 --> 00:15:58,560
It's it's having conversations
with people to encourage

285
00:15:58,560 --> 00:16:00,480
behavior change.
But you don't want to do the

286
00:16:00,480 --> 00:16:03,600
summary too early in the process
because it's the thing that you

287
00:16:03,600 --> 00:16:05,520
do at the closing.
Right.

288
00:16:06,040 --> 00:16:10,800
But if you jump into it too
fast, or if you should ask a

289
00:16:10,800 --> 00:16:15,760
question about you, you should
be making a suggestion to have a

290
00:16:15,760 --> 00:16:18,640
small behavior change, just some
small measurable behavior

291
00:16:18,640 --> 00:16:21,120
change.
But you can make that suggestion

292
00:16:21,400 --> 00:16:24,360
too early in the process, before
you've done enough of the sort

293
00:16:24,360 --> 00:16:29,400
of behavior change work before.
Yes, I understand what you're.

294
00:16:30,000 --> 00:16:31,080
saying.
Right. And so it's

295
00:16:31,880 --> 00:16:32,960
Yeah.
It's the wrong process.

296
00:16:33,040 --> 00:16:37,160
You're kind of giving them the
answer without making them go

297
00:16:37,160 --> 00:16:40,160
through.
I kind of relate it to, like, long

298
00:16:40,160 --> 00:16:44,040
division and short division.
Like you have to learn the full

299
00:16:44,040 --> 00:16:49,360
process first and then you can
learn those shorter ways, yeah.

300
00:16:50,080 --> 00:16:54,120
If that's the if the skill you
need to train has that kind of

301
00:16:54,120 --> 00:16:58,680
complexity in it, then it's
really hard to train it if all

302
00:16:58,680 --> 00:17:01,000
you're doing is single multiple
choice questions.

303
00:17:01,520 --> 00:17:03,720
Exactly.
Say it louder for the people in

304
00:17:03,720 --> 00:17:06,440
the back.
There was just a conversation on

305
00:17:06,440 --> 00:17:10,079
the instructional design subreddit
about when do you use branching

306
00:17:10,079 --> 00:17:12,599
scenarios and how do you get the
ideas for when you should do it

307
00:17:12,599 --> 00:17:14,440
or not.
And somebody said in the real

308
00:17:14,440 --> 00:17:17,760
world, if I can get 80% there
with the one question mini

309
00:17:17,760 --> 00:17:19,720
scenario, should I be doing this?
Well...

310
00:17:19,920 --> 00:17:22,800
If it only meets one of these
and I can kind of do it with a

311
00:17:22,800 --> 00:17:24,280
one question mini scenario,
should you?

312
00:17:24,640 --> 00:17:28,880
Well, part of the answer is you
do branching scenarios when the

313
00:17:28,880 --> 00:17:34,520
problem you're trying to solve
is worth the cost and effort of

314
00:17:34,520 --> 00:17:36,800
building a branching scenario.
Right.

315
00:17:37,600 --> 00:17:39,760
How painful is the problem that
you're trying to solve?

316
00:17:41,280 --> 00:17:43,960
How complex is the skill that
you're trying to train?

317
00:17:44,800 --> 00:17:46,440
Right.
And of course, data will

318
00:17:46,440 --> 00:17:49,280
influence some of that
conversation as well, bringing

319
00:17:49,280 --> 00:17:50,560
some of the data.
Yep.

320
00:17:50,600 --> 00:17:53,240
And so if you've got data and
some of the how painful it is,

321
00:17:53,240 --> 00:17:57,880
is also the if you're training
50 people in a small

322
00:17:57,880 --> 00:18:02,240
organization, the solution has
to be faster to build than if

323
00:18:02,240 --> 00:18:07,440
you are training 10,000 people.
Because something that is a

324
00:18:07,440 --> 00:18:11,600
small problem but magnified by
10,000 people is much more

325
00:18:11,600 --> 00:18:16,080
likely to be expensive enough to
be worth building a more complex

326
00:18:16,480 --> 00:18:18,680
scenario.
And that's part of the real

327
00:18:18,680 --> 00:18:21,480
world of figuring out what
approaches you use.

328
00:18:21,480 --> 00:18:24,640
As much as I'd love to say every
project I do is, like, a cool

329
00:18:24,640 --> 00:18:27,440
branching scenario and.
Like it's changed in the world.

330
00:18:27,640 --> 00:18:32,040
It's not and I do lots of one
question mini scenarios because

331
00:18:32,040 --> 00:18:34,720
sometimes that's what it is.
I I've done interactive video

332
00:18:34,720 --> 00:18:37,640
scenarios.
Actually higher actors have 1/2

333
00:18:37,640 --> 00:18:40,680
day shoot.
It's, it's totally overkill for

334
00:18:40,680 --> 00:18:42,160
this.
And I, I've worked with this

335
00:18:42,160 --> 00:18:48,080
great Snee who has this vast
collection of photos of real

336
00:18:48,080 --> 00:18:53,360
problems in the environment.
And so I love her so much

337
00:18:53,360 --> 00:18:55,520
because she's gone through and
like labeled them.

338
00:18:55,720 --> 00:18:58,480
She'll sort them into folders of
like which problem it is, and

339
00:18:58,480 --> 00:19:01,800
then she'll have bad and good
and has them labeled in the

340
00:19:01,800 --> 00:19:02,960
photos.
This is great.

341
00:19:03,120 --> 00:19:05,520
And so we did a lot of things.
So frankly, for that one, the

342
00:19:05,520 --> 00:19:10,120
scenarios were: here's a photo,
you're inspecting this site,

343
00:19:10,320 --> 00:19:13,600
here's what you see.
Here's the checklist that you're

344
00:19:13,600 --> 00:19:16,640
using for your inspection.
What problems do you check off

345
00:19:16,640 --> 00:19:19,440
which, which things meet it,
which things don't?

346
00:19:19,440 --> 00:19:23,400
Based on the photos, those are
much smaller, lightweight

347
00:19:23,400 --> 00:19:26,600
scenarios.
But the skill I need people to

348
00:19:26,600 --> 00:19:30,160
do in that case is to visually
look at something and recognize

349
00:19:30,160 --> 00:19:31,960
whether it's a problem that
needs to be addressed or not.

350
00:19:32,960 --> 00:19:36,960
And so I'm getting the practice
exercise to look as much like

351
00:19:37,360 --> 00:19:42,360
the real skill that they need to
do as I can do in a self-paced

352
00:19:42,360 --> 00:19:44,800
e-learning.
My gosh, I love this episode.

353
00:19:45,120 --> 00:19:47,600
And now I'm going to shift the
conversation because last time

354
00:19:47,600 --> 00:19:52,560
that we talked, AI was coming
out, like ChatGPT. That first

355
00:19:52,560 --> 00:19:54,840
segment of this episode,
That is so much valuable

356
00:19:54,840 --> 00:19:56,400
information for instructional
designers.

357
00:19:56,400 --> 00:19:58,200
So they need to go re-listen to
that part.

358
00:19:59,480 --> 00:20:03,560
But let's shift into AI.
We were nervous about what was

359
00:20:03,600 --> 00:20:06,960
AI gonna do to the industry,
instructional designers, L&D as

360
00:20:06,960 --> 00:20:09,320
a whole.
My perspective has definitely

361
00:20:09,320 --> 00:20:11,160
changed.
How's your perspective?

362
00:20:11,160 --> 00:20:14,640
What do you think about AI and
how it fits in the workflows and

363
00:20:14,640 --> 00:20:17,840
things today for us?
We were at that point where

364
00:20:17,840 --> 00:20:19,960
there was a lot of people
talking about is AI going to

365
00:20:19,960 --> 00:20:22,680
take, you know, like is the
instructional design field going

366
00:20:22,680 --> 00:20:26,440
to exist five years from now?
I will say that I think overall,

367
00:20:27,000 --> 00:20:29,360
I don't think that the field of
instructional design is going to

368
00:20:29,360 --> 00:20:31,840
disappear.
I do think that roles are going

369
00:20:31,840 --> 00:20:36,920
to change, and I think we don't
know what all of those changes

370
00:20:36,920 --> 00:20:38,320
will be.
Yes.

371
00:20:39,400 --> 00:20:43,400
There's this idea that we tend
to overestimate the impact of

372
00:20:43,400 --> 00:20:46,960
new technology in the short term
and underestimate the impact of

373
00:20:46,960 --> 00:20:49,000
it in the long term.
That is very true.

374
00:20:49,440 --> 00:20:55,360
So when we see the, the, the
arguments of like, oh, your

375
00:20:55,360 --> 00:20:58,160
whole job is going to change in
the next two to three years.

376
00:20:58,360 --> 00:21:02,400
And if you aren't on board with
AI right now doing using it

377
00:21:02,400 --> 00:21:05,000
every single day, somebody else
is going to take your job.

378
00:21:05,240 --> 00:21:09,280
I get really uncomfortable with
that level of hype and fear

379
00:21:09,280 --> 00:21:12,680
mongering.
The reality is big organizations

380
00:21:12,680 --> 00:21:16,680
do not move that fast.
No, especially higher education

381
00:21:17,040 --> 00:21:21,280
right, in academia.
On the other hand, I don't think

382
00:21:21,280 --> 00:21:24,040
you should put your head in the sand
and ignore it.

383
00:21:24,240 --> 00:21:26,240
It's not going away.
No.

384
00:21:26,920 --> 00:21:29,360
And you do need to be paying
attention.

385
00:21:29,360 --> 00:21:32,720
One of the assets that we have
in the L&D field is that we do

386
00:21:32,720 --> 00:21:36,840
tend to be curious people.
We like learning new stuff.

387
00:21:36,840 --> 00:21:38,840
We're.
Super nerdy Christy, we

388
00:21:38,840 --> 00:21:41,640
establish this.
We will go explore super nerdy

389
00:21:41,640 --> 00:21:44,080
things.
As established, we're super

390
00:21:44,080 --> 00:21:47,440
nerdy and enjoy learning for the
sake of learning in ways that,

391
00:21:47,520 --> 00:21:50,480
like other people, maybe don't
enjoy learning for the sake of

392
00:21:50,480 --> 00:21:52,360
learning quite as much as we all
do.

393
00:21:53,640 --> 00:21:56,240
You should be getting some of
that hands on practice.

394
00:21:56,240 --> 00:21:58,200
You should be trying and
experimenting with it.

395
00:21:58,440 --> 00:22:02,520
You should be figuring out where
it's helpful to you and where

396
00:22:02,520 --> 00:22:06,000
the limits are, the tools are.
And sometimes it is also that

397
00:22:06,000 --> 00:22:09,080
you try the tools for something
and say, I don't think this is

398
00:22:09,080 --> 00:22:11,560
quite working yet.
But as fast as the technology is

399
00:22:11,560 --> 00:22:15,000
changing, in six months or a
year it may be able to do the

400
00:22:15,000 --> 00:22:18,640
thing that you want to do.
For my own work, I do use

401
00:22:18,640 --> 00:22:22,120
ChatGPT and Claude for writing
and I definitely do use it to help

402
00:22:22,120 --> 00:22:24,840
me get unstuck.
And I suck at writing titles for

403
00:22:24,840 --> 00:22:27,440
presentations.
We're twinning there because I

404
00:22:27,440 --> 00:22:30,080
used to spend hours trying to
think of creative titles.

405
00:22:30,160 --> 00:22:34,280
Oh my gosh, it's torture.
Yes, or, or the the other thing

406
00:22:34,280 --> 00:22:37,320
that it's really good at of the
like, oh, I have an acronym.

407
00:22:37,360 --> 00:22:41,080
We've actually kind of decided
on an acronym for this thing and

408
00:22:41,080 --> 00:22:43,680
we've got these concepts.
Help me come up with things that

409
00:22:43,680 --> 00:22:46,880
will go for each letter in this,
like give me three options for

410
00:22:46,880 --> 00:22:49,600
each letter in this acronym.
Or here's the topic.

411
00:22:49,600 --> 00:22:52,320
And like help me come up with
mnemonics to do

412
00:22:52,320 --> 00:22:54,680
this.
Cuz like how long does it take

413
00:22:54,680 --> 00:22:56,440
to come up with a really good
mnemonic?

414
00:22:56,640 --> 00:22:58,680
But brainstorming that with AI
is great.

415
00:22:59,120 --> 00:23:02,360
Yeah, 100%.
I use it a lot for image

416
00:23:02,400 --> 00:23:05,120
generation, again because my
work does tend to be a lot of

417
00:23:05,120 --> 00:23:08,880
scenarios.
With AI, I can go in with

418
00:23:08,880 --> 00:23:11,880
Midjourney and I can generate
unique characters for every

419
00:23:11,880 --> 00:23:14,920
single project.
I can do different poses for

420
00:23:14,920 --> 00:23:17,240
them, I can change their outfit
and their setting.

421
00:23:18,560 --> 00:23:24,160
I love that.
Midjourney is my primary tool for

422
00:23:24,160 --> 00:23:27,160
images.
You spend $10 to get it for one

423
00:23:27,160 --> 00:23:28,640
month.
You can try it out and then you

424
00:23:28,640 --> 00:23:31,120
can actually cancel it.
And there are some specific

425
00:23:31,120 --> 00:23:35,320
things that ChatGPT can do, like
giving it one image and then

426
00:23:35,320 --> 00:23:38,480
saying, OK, now rotate this 90°
so you can have the background

427
00:23:38,720 --> 00:23:42,560
behind another character.
Now show the view of the lobby

428
00:23:42,680 --> 00:23:44,200
of what they would see from that
desk.

429
00:23:44,400 --> 00:23:47,480
ChatGPT does those things where
most of the other image tools

430
00:23:47,480 --> 00:23:48,760
don't.
Don't understand language enough

431
00:23:48,760 --> 00:23:51,040
to do that.
Right, right.

432
00:23:51,840 --> 00:23:55,120
But Midjourney does stylistic,
does consistent characters.

433
00:23:55,320 --> 00:23:58,600
You can do consistent styles.
So if you give it one reference

434
00:23:58,600 --> 00:24:03,280
image of an illustration, I can
come up with 20 images that all

435
00:24:03,280 --> 00:24:07,080
have the same visual style and
colors that look like a whole

436
00:24:07,080 --> 00:24:11,120
set for a training.
I think long term we're gonna

437
00:24:11,120 --> 00:24:14,600
see bigger changes, but if
you're looking for the ways to

438
00:24:14,600 --> 00:24:18,080
get started right now, I think
image generation is one of the

439
00:24:18,080 --> 00:24:20,800
places that you can go and do
things right now.

440
00:24:20,960 --> 00:24:23,040
Yeah, and learn how to like,
talk and learn.

441
00:24:23,040 --> 00:24:24,000
How to do it?
Prompts.

442
00:24:24,160 --> 00:24:27,320
Yep.
And do and solve actual problems

443
00:24:27,320 --> 00:24:30,560
for yourself.
I also don't like using these to

444
00:24:30,560 --> 00:24:33,720
replicate the styles of living
artists.

445
00:24:33,760 --> 00:24:37,160
That's one of my other ethical
lines with AI image Gen.

446
00:24:38,920 --> 00:24:42,640
Yeah, we could definitely have a
whole conversation about that

447
00:24:42,680 --> 00:24:46,240
for sure, yes.
The ethics of this, I

448
00:24:46,480 --> 00:24:51,720
personally... you want to make a
parody Starry Night? Great.

449
00:24:52,320 --> 00:24:55,280
No living artist is getting
harmed by your training.

450
00:24:55,280 --> 00:24:57,160
Customized Starry Night
painting.

451
00:24:57,240 --> 00:25:00,680
But I don't use it to recreate
things of living artists.

452
00:25:00,680 --> 00:25:02,560
That, for me, is the ethical
line.

453
00:25:02,560 --> 00:25:04,600
We'll definitely see what
happens in the future with all

454
00:25:04,600 --> 00:25:08,280
those different things as we're
coming up on the end of the

455
00:25:08,280 --> 00:25:10,720
episode, like where can people
find you?

456
00:25:10,720 --> 00:25:12,240
What are some of your final
thoughts?

457
00:25:12,240 --> 00:25:17,320
This has been jam packed with
tips, AI, you know, things to

458
00:25:17,320 --> 00:25:19,520
do, freelancing, all those
different things.

459
00:25:19,520 --> 00:25:20,880
So where, where can people find
you?

460
00:25:20,880 --> 00:25:22,640
Where can they follow you?
Of course, we're gonna include

461
00:25:22,640 --> 00:25:24,520
everything in the show notes,
but I want you to tell them

462
00:25:24,520 --> 00:25:26,720
where to find you and yes, and
connect.

463
00:25:27,080 --> 00:25:29,560
Yes, absolutely.
And so you can definitely find

464
00:25:29,560 --> 00:25:32,680
me on LinkedIn.
My blog is Christy Tucker

465
00:25:32,680 --> 00:25:34,680
Learning.
If you can't remember how to

466
00:25:34,680 --> 00:25:38,320
spell Christy, you can
also do C Tucker learning.com,

467
00:25:38,320 --> 00:25:41,360
which is fewer letters.
Yeah, everybody spells it

468
00:25:41,360 --> 00:25:43,120
differently.
Right, exactly.

469
00:25:43,640 --> 00:25:46,080
Hopefully you're watching the
episode so you can see how to

470
00:25:46,080 --> 00:25:47,880
spell it.
Right, if you're watching, it'll

471
00:25:47,880 --> 00:25:50,640
be clear.
But if you are listening to this

472
00:25:50,640 --> 00:25:54,920
as a podcast in the car,
then C Tucker learning is a lot

473
00:25:54,920 --> 00:25:58,560
easier to remember.
And I'm on Bluesky, Christy

474
00:25:58,560 --> 00:26:03,960
Tucker there as well.
And so those are the big places

475
00:26:03,960 --> 00:26:05,720
to find me.
Fantastic.

476
00:26:06,400 --> 00:26:08,920
I have a YouTube channel, I'm
just not really doing anything

477
00:26:08,920 --> 00:26:09,600
with it.
You can

478
00:26:09,640 --> 00:26:11,920
go. Yeah, YouTube is a challenge
for me.

479
00:26:11,920 --> 00:26:14,440
I'm trying to maybe we can
motivate each other to get the

480
00:26:14,440 --> 00:26:17,280
YouTube going.
I know it's, it's just it's one

481
00:26:17,280 --> 00:26:19,680
more channel and it's.
One more thing.

482
00:26:20,080 --> 00:26:22,240
Yeah, it's like you have the
podcast and so that's your

483
00:26:22,240 --> 00:26:24,320
regular thing.
I have, I have my blog and

484
00:26:24,320 --> 00:26:26,440
that's my regular thing, right?
So.

485
00:26:27,080 --> 00:26:31,120
Yeah, well, Christy, you're
amazing and I love talking to

486
00:26:31,120 --> 00:26:33,280
you and you give so much great
advice.

487
00:26:33,680 --> 00:26:35,840
Just want to say thanks again
for all you do for people in the

488
00:26:35,840 --> 00:26:37,800
L&D space.
I can't wait for people to hear

489
00:26:37,800 --> 00:26:39,680
this episode.
Everything's going to be in the

490
00:26:39,680 --> 00:26:42,120
show notes that Christy
mentioned, where to find her,

491
00:26:42,320 --> 00:26:45,280
where to connect, go to her
blog, follow her out on

492
00:26:45,280 --> 00:26:46,920
LinkedIn.
She's an influencer.

493
00:26:47,080 --> 00:26:48,920
So thank you so much for coming
back on the show.

494
00:26:48,920 --> 00:26:51,840
We appreciate it.
Yeah, thanks so much for giving

495
00:26:51,840 --> 00:26:55,400
me another opportunity to to
nerd out with you.

496
00:26:55,640 --> 00:26:58,880
Anytime.
Go like deep into well, there

497
00:26:58,880 --> 00:27:02,480
are specific tools for Here's my
nuanced views on these little

498
00:27:02,480 --> 00:27:05,200
Berry.
I love the insider perspective

499
00:27:05,200 --> 00:27:08,440
in that we can dive deeper into
that, and of course that leaves

500
00:27:08,440 --> 00:27:11,160
it open for future episodes of
different topics we can just

501
00:27:11,160 --> 00:27:13,160
talk about or LinkedIn live.
So we'll probably do that one

502
00:27:13,160 --> 00:27:14,240
day.
All right.

503
00:27:16,600 --> 00:27:19,520
Hi, we're iSpring, an
international team of e-learning

504
00:27:19,520 --> 00:27:22,760
enthusiasts who help more than
60,000 clients across the globe

505
00:27:22,760 --> 00:27:24,920
succeed with better online
learning.

506
00:27:25,400 --> 00:27:28,800
Our two flagship solutions are
iSpring Suite and iSpring Learn

507
00:27:28,840 --> 00:27:31,560
LMS.
iSpring Suite is an intuitive,

508
00:27:31,600 --> 00:27:34,280
all in one authoring tool for
creating engaging e-learning

509
00:27:34,280 --> 00:27:37,480
content, while iSpring Learn is
an innovative online training

510
00:27:37,480 --> 00:27:40,480
platform for onboarding,
upskilling and certifying your

511
00:27:40,480 --> 00:27:42,640
teams.
We'd be happy to get to know you

512
00:27:42,680 --> 00:27:44,640
and pick a solution that fits
your needs best.

513
00:27:45,040 --> 00:27:49,120
Go to www.ispringsolutions.com
to learn more about us and

514
00:27:49,120 --> 00:27:51,800
connect.
Thanks for spending a few

515
00:27:51,800 --> 00:27:54,120
minutes with Holly.
She knows your podcast queue is

516
00:27:54,120 --> 00:27:56,800
packed.
If today's episode sparked an

517
00:27:56,800 --> 00:28:00,960
idea or gave you that extra
nudge of confidence, tap, follow

518
00:28:01,200 --> 00:28:04,680
or subscribe in your favorite
app so you never miss an episode

519
00:28:04,680 --> 00:28:07,920
of EdUp L&D.
Dropping a quick rating or

520
00:28:07,920 --> 00:28:11,440
review helps more educators and
learning pros discover the show,

521
00:28:11,440 --> 00:28:13,400
too.
Want to keep the conversation

522
00:28:13,400 --> 00:28:15,680
going?
Connect with Holly on LinkedIn

523
00:28:15,800 --> 00:28:17,280
and share your biggest take
away.

524
00:28:17,520 --> 00:28:20,840
She reads every message.
Until next time, keep learning,

525
00:28:20,920 --> 00:28:23,480
keep leading, and keep believing
in your own story.

526
00:28:23,960 --> 00:28:24,440
Talk soon.

Christy Tucker

Learning Experience Design Consultant

Christy Tucker is a learning experience design consultant with 20 years of experience helping people learn. She specializes in using scenario-based learning to engage audiences and promote skill transfer to real-world environments. She has created training for clients including the National Alliance for Partnerships in Equity (NAPE), Cisco, FIRST, and NAFSA: Association of International Educators. Christy has been blogging about instructional design and elearning for over 15 years and is a regular speaker at industry conferences and events.