84: Can A.I. Save Writing? A Special Collaborative Episode with Eric James Stephens, Writing Sensei & Business Intelligence Developer

In 2022, OpenAI launched ChatGPT. Overnight, the company started a global conversation about the future of artificial intelligence.

The emergence of AI has turned learning on its head. It has forced us to ask major questions:

- How does AI fit into the future of learning?
- How can AI make learning more personalized and adaptive?
- What are the ethical implications of AI?

This podcast will tackle these questions and more!


Follow Eric on LinkedIn

Thanks for tuning in! 🎧

 

1
00:00:01,200 --> 00:00:06,800
Hello, my name is Holly Owens
and welcome to EdUp EdTech, the

2
00:00:06,800 --> 00:00:10,500
podcast that keeps you
in the know about all the latest

3
00:00:10,500 --> 00:00:14,300
edtech happenings.
We interview guests from around

4
00:00:14,300 --> 00:00:17,500
the globe to give you deeper
insights into the edtech

5
00:00:17,500 --> 00:00:21,500
industry, the field of
instructional design, and more.

6
00:00:21,800 --> 00:00:25,200
We're proudly a part of
America's leading podcast

7
00:00:25,200 --> 00:00:29,900
network, the EdUp Experience.
It's time to sit back and

8
00:00:30,000 --> 00:00:33,000
enjoy the latest episode of
EdUp EdTech.

9
00:00:39,300 --> 00:00:43,900
Hello and welcome to the first
collaborative podcast of the EdUp

10
00:00:43,900 --> 00:00:48,000
AI podcast and the EdUp
EdTech podcast with Holly

11
00:00:48,000 --> 00:00:50,700
Owens.
And our guest today that we're

12
00:00:50,700 --> 00:00:53,000
going to be talking to and
interviewing and learning

13
00:00:53,000 --> 00:00:56,300
everything about in terms of AI
and everything else is Eric

14
00:00:56,300 --> 00:00:59,500
James Stephens.
Eric, welcome.

15
00:00:59,700 --> 00:01:01,800
Thank you, both, for having me
here.

16
00:01:01,800 --> 00:01:04,000
I am very excited to have this
conversation with you because

17
00:01:04,000 --> 00:01:07,000
this is just cool stuff.
Absolutely, it's going to be

18
00:01:07,000 --> 00:01:09,600
fun.
I'm super excited.

19
00:01:09,800 --> 00:01:14,800
And I can jump in with the first
question Eric and you can feel

20
00:01:14,800 --> 00:01:18,100
free to address it how you want
and work in what you're doing

21
00:01:18,100 --> 00:01:20,800
right now and everything else.
Because I know that right now you

22
00:01:20,800 --> 00:01:24,900
have your own day job that
you're doing full time and then

23
00:01:24,900 --> 00:01:29,300
you're also starting something
and you're an entrepreneur with

24
00:01:29,300 --> 00:01:32,100
Project Rhizome.
So how would you describe what

25
00:01:32,100 --> 00:01:36,900
you're doing at your day job
with AI and then also with that

26
00:01:36,900 --> 00:01:40,700
other project?
Okay, I like that question. The job

27
00:01:40,700 --> 00:01:42,900
that I have now:
I'm a business intelligence

28
00:01:42,900 --> 00:01:48,400
developer working with Power BI.
And so this is a

29
00:01:48,400 --> 00:01:51,800
construction company that's been
around for over 100 years but

30
00:01:51,800 --> 00:01:54,000
now they're asking themselves,
like, what can we learn from our

31
00:01:54,000 --> 00:01:57,100
data?
And that's where I love to be

32
00:01:57,100 --> 00:02:00,100
and where I love to work:
figuring out

33
00:02:00,300 --> 00:02:02,100
complex problems.
That's a lot of fun.

34
00:02:02,100 --> 00:02:04,800
I've had ChatGPT open the whole

35
00:02:04,800 --> 00:02:08,699
time, asking, what does this
mean in this environment? Or, I

36
00:02:08,699 --> 00:02:10,199
used to
do this in Tableau.

37
00:02:10,500 --> 00:02:12,000
How do I do it in
Power BI?

38
00:02:12,300 --> 00:02:14,500
And it will tell me the
different calculations that I

39
00:02:14,500 --> 00:02:16,700
need.
It's been a lot of fun, but

40
00:02:16,700 --> 00:02:19,300
that's not why we're here, we're
here because we're talking about

41
00:02:19,400 --> 00:02:22,400
AI and startups.
I am working on this thing

42
00:02:22,400 --> 00:02:27,700
called Project Rhizome, and I am
building an AI-powered teaching tool.

43
00:02:28,000 --> 00:02:33,100
I believe that one of the
biggest things that teachers

44
00:02:33,100 --> 00:02:36,900
need back in their lives is time
and one of the biggest time

45
00:02:36,900 --> 00:02:40,200
constraints that they have is
the task of grading writing

46
00:02:40,200 --> 00:02:43,100
specifically.
I want to be able to create

47
00:02:43,100 --> 00:02:47,900
something that helps teachers
do that task better and faster,

48
00:02:47,900 --> 00:02:51,800
in a way that students also
learn how to write better than

49
00:02:51,800 --> 00:02:57,300
they are right now. As someone
who has been thinking about and

50
00:02:57,300 --> 00:02:59,500
writing about the ethical use of
Big Data.

51
00:02:59,500 --> 00:03:05,100
since 2016, I feel like I
cannot

52
00:03:06,500 --> 00:03:08,600
watch someone else build
something

53
00:03:09,400 --> 00:03:12,700
and then critique it. If I was still
an academic, that's what I

54
00:03:12,708 --> 00:03:15,200
would be doing; I wouldn't
have a say. But now that I

55
00:03:15,208 --> 00:03:18,200
know how to do data and now that
I know about these different

56
00:03:18,200 --> 00:03:21,500
things, I want to be the
one to build it, to be able

57
00:03:21,500 --> 00:03:24,500
to say, hey let's bring in a
variety of people.

58
00:03:24,500 --> 00:03:27,500
Let's do these different things.
This first phase of Project

59
00:03:27,500 --> 00:03:32,300
Rhizome that we're talking about
is a grassroots data collection

60
00:03:32,300 --> 00:03:36,100
for student essays.
I want to create an AI that can

61
00:03:36,300 --> 00:03:42,400
assess student writing.
Fundamentally, I'm against using

62
00:03:42,400 --> 00:03:46,100
data that's been scraped or
bought from the web or from a

63
00:03:46,108 --> 00:03:49,700
university. Universities can sell
writing from their students.

64
00:03:49,700 --> 00:03:51,200
I don't think that's a good
thing.

65
00:03:51,300 --> 00:03:54,500
I don't think people know a lot
about that, they don't assume

66
00:03:54,500 --> 00:03:58,300
that and that's totally true.
That's definitely a need to

67
00:03:58,300 --> 00:04:01,600
know.
There's this idea of who owns an essay. So

68
00:04:01,600 --> 00:04:05,100
I believe that the person who
wrote it owns it, and I want to

69
00:04:05,100 --> 00:04:08,100
collect it from them.
I applied to this idea,

70
00:04:08,100 --> 00:04:10,500
accelerator called Builders and
Backers, and they're being

71
00:04:10,500 --> 00:04:12,900
funded by Heartland Forward.
They're amazing people.

72
00:04:12,900 --> 00:04:15,600
Thank you so much to all those
people. They gave me five

73
00:04:15,600 --> 00:04:19,000
thousand dollars in order to
conduct an experiment.

74
00:04:19,200 --> 00:04:23,300
Other people are hiring coders
to build things, or they're

75
00:04:23,300 --> 00:04:25,600
buying product.
I wanted to give that five

76
00:04:25,600 --> 00:04:28,900
thousand dollars away.
I'm giving away five thousand

77
00:04:28,900 --> 00:04:32,200
dollars to ten different people,
10 people get 500 bucks.

78
00:04:32,200 --> 00:04:35,200
I already gave one away to one
person, and she's awesome.

79
00:04:35,200 --> 00:04:38,400
Gave her 500 bucks last weekend
and this weekend I'm giving away

80
00:04:38,400 --> 00:04:42,600
$500 to someone else.
I am asking people to post about

81
00:04:42,600 --> 00:04:43,700
it.
I'm trying to share it.

82
00:04:43,800 --> 00:04:46,700
I convinced my kids to pass
out flyers with me.

83
00:04:46,700 --> 00:04:49,700
Last night I
went to the University of Tulsa

84
00:04:49,700 --> 00:04:53,800
and we didn't break into the library,
but we found somebody to let us

85
00:04:53,800 --> 00:04:56,100
into the library.
We just passed out flyers to

86
00:04:56,100 --> 00:05:02,500
people, because I want this AI
to not be an individual person

87
00:05:02,500 --> 00:05:05,500
building it. It has to be a
collaboration and it has to be

88
00:05:05,500 --> 00:05:08,700
done on ethically sourced data
and that's what I wanted.

89
00:05:09,900 --> 00:05:12,600
Well, this sounds amazing.
I'm so glad that we're partnering

90
00:05:12,600 --> 00:05:17,400
for this episode because this is
totally EdUp EdTech's jam.

91
00:05:17,400 --> 00:05:20,900
Now, Eric, this journey, I
know you posted three episodes

92
00:05:20,900 --> 00:05:23,700
and then it kind of stalled.
But now it's back up and

93
00:05:23,700 --> 00:05:26,200
going. I can't not.
Isn't that what we talked about

94
00:05:26,200 --> 00:05:28,900
when we were talking with Linda?
That's how it works

95
00:05:28,900 --> 00:05:31,600
though.
You stop it and then you go back

96
00:05:31,600 --> 00:05:33,000
to the
drawing board.

97
00:05:33,000 --> 00:05:37,300
So we're glad you're back.
You're back at this spot here.

98
00:05:37,300 --> 00:05:41,600
Tell us more about what
people can do to help you

99
00:05:41,600 --> 00:05:44,400
collect this data.
You mentioned the 5,000

100
00:05:44,400 --> 00:05:46,700
and the 500.
And obviously, we'll put

101
00:05:46,700 --> 00:05:48,900
everything in the show notes if
you're interested in

102
00:05:48,900 --> 00:05:52,000
contributing to Eric and the
data that he is collecting.

103
00:05:52,300 --> 00:05:56,300
So, tell us a little bit more
about that situation right now.

104
00:05:56,300 --> 00:06:01,000
There are really three things that
people can do to really help

105
00:06:01,000 --> 00:06:04,600
build. One is the actual, like,
helping with the collection of essays.

106
00:06:04,600 --> 00:06:07,500
So if you have essays, submit
them. And I don't just

107
00:06:07,500 --> 00:06:10,600
need them from current students.
I'm about to get my perspective

108
00:06:10,600 --> 00:06:12,500
on my dissertation.
Exactly.

109
00:06:12,500 --> 00:06:15,400
So it's between 3 and 15 pages,
right?

110
00:06:15,400 --> 00:06:17,500
But it could be for any
discipline, right?

111
00:06:17,500 --> 00:06:20,500
And the reason that it's
called Rhizome is

112
00:06:20,900 --> 00:06:24,800
because I believe like writing
is a rhizomatic thing.

113
00:06:25,400 --> 00:06:27,400
A rhizome is something that
grows nebulously.

114
00:06:27,500 --> 00:06:29,600
Actually, there's no single point
of origin.

115
00:06:30,000 --> 00:06:32,000
Writing has no single point of
origin.

116
00:06:32,000 --> 00:06:37,900
Every discipline is
infested with this writing

117
00:06:38,000 --> 00:06:40,300
thing. Any writing from any
class

118
00:06:40,600 --> 00:06:44,400
I will take because I believe
fundamentally that what makes a

119
00:06:44,407 --> 00:06:48,000
good transition in a history
class is a good transition in a

120
00:06:48,000 --> 00:06:51,100
biology class. Anybody who's
telling you otherwise is

121
00:06:51,100 --> 00:06:54,700
selling you something.
That's the fun of it.

122
00:06:55,200 --> 00:06:58,700
So that's the one: there's
actually, like, collecting data or

123
00:06:58,700 --> 00:07:01,500
collecting essays.
I'm hoping the

124
00:07:01,500 --> 00:07:04,900
graduate students are hearing me
well: I'll accept their writing and

125
00:07:04,900 --> 00:07:06,500
I'll accept their students'
writing.

126
00:07:06,500 --> 00:07:10,300
The other thing is
that I'm collaborating with the

127
00:07:10,300 --> 00:07:13,600
EdUp Experience as a whole, with
Joy Holden, and we're doing a

128
00:07:13,608 --> 00:07:17,400
state of student writing survey
where professionals like

129
00:07:17,400 --> 00:07:20,700
yourselves education leaders
teachers can go out and share

130
00:07:20,700 --> 00:07:23,800
their opinions about how they
feel about the state of writing.

131
00:07:24,100 --> 00:07:25,800
We're going to take those
results, we're going to publish a

132
00:07:25,800 --> 00:07:27,400
white paper and that's going to
be fun.

133
00:07:28,000 --> 00:07:31,700
The third way that people can
help is by signing up to learn

134
00:07:31,700 --> 00:07:35,200
more.
This next phase right now is

135
00:07:35,500 --> 00:07:40,800
phase one, that's getting my database.
Phase two is training my data.

136
00:07:41,200 --> 00:07:43,900
Now, I could be sitting by
myself, reading a whole bunch of

137
00:07:43,900 --> 00:07:47,300
sentences and grading and
training an algorithm about how

138
00:07:47,300 --> 00:07:50,200
Eric would rate a paper.
Or I might even find a

139
00:07:50,207 --> 00:07:52,800
team of five people,
so we could say, like,

140
00:07:52,800 --> 00:07:55,600
how would this group of five
people read these essays?

141
00:07:56,200 --> 00:07:59,600
I'm going to use a crowdsourced
model where people can log in.

142
00:07:59,600 --> 00:08:02,500
I can vet them, I can make
sure they have credentials and

143
00:08:02,500 --> 00:08:05,800
that they should be assessing essays.
And then they'll

144
00:08:05,800 --> 00:08:09,700
be able to enter into this
platform and be able to help me

145
00:08:09,700 --> 00:08:12,200
train my sentences.
I'll give them a sentence.

146
00:08:12,200 --> 00:08:14,200
I'll ask them a question.
Be like, mark this.

147
00:08:14,200 --> 00:08:15,700
How, what do you feel about
this?

148
00:08:16,000 --> 00:08:20,700
And tell me why, and then my
goal is essentially every single

149
00:08:20,700 --> 00:08:25,300
essay that I get is going to be
graded and read

150
00:08:25,300 --> 00:08:28,400
against a rubric criteria,
right?

151
00:08:28,500 --> 00:08:32,200
And as an instructor, I
love that so much, right?

152
00:08:32,200 --> 00:08:37,500
And each single essay,
understanding each criterion, is

153
00:08:37,500 --> 00:08:40,600
going to be graded twice,
double-blind peer review.

154
00:08:41,400 --> 00:08:43,900
Like, when you look at,
I like the question:

155
00:08:44,100 --> 00:08:46,300
Does this paper have a good
transition?

156
00:08:47,200 --> 00:08:51,300
That question is probably 15
different questions about

157
00:08:51,300 --> 00:08:54,100
transitions that you as an
individual

158
00:08:54,100 --> 00:08:58,600
do not have the time to ask
every single paper.

159
00:08:59,700 --> 00:09:01,800
I can train an algorithm to do
that.

160
00:09:01,900 --> 00:09:05,100
Let's say that my final count is
20 criteria.

161
00:09:05,500 --> 00:09:10,100
That means that every sentence
from every submission that I get

162
00:09:10,300 --> 00:09:15,900
will be read 40 times.
And that's the data that we're

163
00:09:15,900 --> 00:09:18,500
feeding it. That's why it's a
large language model.

164
00:09:18,700 --> 00:09:23,200
That's why it's big data:
because we're creating metadata.
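
(As a minimal sketch of the kind of data this crowdsourced, double-blind grading could produce, here is an illustrative Python record plus the reads-per-sentence arithmetic. The field names and helper are assumptions for illustration, not Project Rhizome's actual schema; only the 20-criteria, two-reviews-each math comes from the episode.)

```python
from dataclasses import dataclass

@dataclass
class SentenceAnnotation:
    essay_id: str      # which submitted essay the sentence came from
    sentence: str      # the sentence shown to the reviewer
    criterion: str     # e.g. "transitions", one rubric criterion
    reviewer_id: str   # a vetted, credentialed reviewer
    rating: int        # the reviewer's judgment for this criterion
    rationale: str     # the "tell me why" free-text answer

def reads_per_sentence(num_criteria: int, reviews_per_criterion: int = 2) -> int:
    # Double-blind review means each criterion is rated twice,
    # so 20 criteria means 40 reads of every sentence, as estimated above.
    return num_criteria * reviews_per_criterion

print(reads_per_sentence(20))  # 40
```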

165
00:09:23,500 --> 00:09:27,000
My contention is that the
number one problem with all

166
00:09:27,600 --> 00:09:31,300
artificial intelligence out
there today is not the

167
00:09:31,300 --> 00:09:34,400
algorithm.
It's the quality of data that

168
00:09:34,400 --> 00:09:36,800
they're using.
That's what I want to address

169
00:09:36,800 --> 00:09:40,000
fundamentally.
I'm going to need help creating

170
00:09:40,100 --> 00:09:43,900
all of that data so you can sign
up and help me do that.

171
00:09:44,100 --> 00:09:46,700
When that comes, you can create an
account, and there are, like, scoreboards

172
00:09:46,700 --> 00:09:48,500
and stuff, and the leaderboards
will be fun.

173
00:09:49,100 --> 00:09:51,100
Yeah, that will be fun gamified
a bit.

174
00:09:51,100 --> 00:09:53,200
Make some competition out of it,
I love it.

175
00:09:53,400 --> 00:09:56,200
Exactly.
Now my brain is going a thousand

176
00:09:56,200 --> 00:09:58,200
different directions.
Every time we chat, that's what

177
00:09:58,200 --> 00:09:59,700
happens.
But that's a good thing.

178
00:09:59,900 --> 00:10:04,000
I'm thinking about how, in terms of
AI, everybody's become an

179
00:10:04,000 --> 00:10:06,500
expert right now, and it's very
new.

180
00:10:06,500 --> 00:10:11,200
It's a very new frontier.
So as you are embracing this

181
00:10:11,200 --> 00:10:15,800
entrepreneurship into the AI
space, how How are you dealing

182
00:10:15,800 --> 00:10:17,400
with that?
in, like, your life and

183
00:10:17,400 --> 00:10:19,600
navigating
all these other people? Like, I

184
00:10:19,600 --> 00:10:22,500
know this, and AI is bad here,
because New York City public

185
00:10:22,500 --> 00:10:25,300
schools have already banned it.
It's not allowed there, or

186
00:10:25,300 --> 00:10:27,400
institutions have banned it.
How are you

187
00:10:27,600 --> 00:10:30,500
navigating that space
as you're stepping into this

188
00:10:30,500 --> 00:10:36,100
entrepreneurial journey? I think
part of it is recognizing when

189
00:10:36,100 --> 00:10:40,200
to step back. I mean, in early January
I was reaching out to Jason.

190
00:10:40,200 --> 00:10:42,800
We were talking about, let's do a
series of different things and I

191
00:10:42,808 --> 00:10:46,200
was like, okay. This, to me, I was
ramping up because I like to be

192
00:10:46,200 --> 00:10:48,400
a voice.
I like to be on the stage and I

193
00:10:48,400 --> 00:10:50,600
just saw exactly what you
mentioned.

194
00:10:52,000 --> 00:10:57,700
Is that this is just going so
fast and so, there's so much.

195
00:10:58,300 --> 00:11:02,400
You cannot be an expert.
I think the only true experts on

196
00:11:02,400 --> 00:11:06,100
AI that exist, are the experts
that were talking about AI

197
00:11:06,200 --> 00:11:09,200
before November 2022.
Those are the people that we

198
00:11:09,200 --> 00:11:11,300
should really be paying
attention to.

199
00:11:11,600 --> 00:11:15,600
But here's what I think, the
reason that there are so many

200
00:11:15,600 --> 00:11:23,200
experts in AI is because so
many experts exist, and AI just

201
00:11:23,200 --> 00:11:28,900
makes an expert better.
A lawyer, who knows how to use

202
00:11:28,900 --> 00:11:33,800
ChatGPT is not really going to
do well teaching a marketing

203
00:11:34,100 --> 00:11:37,900
person or marketer how to use
ChatGPT for her skills.

204
00:11:38,500 --> 00:11:41,100
Right?
A car mechanic is going to use

205
00:11:41,100 --> 00:11:43,600
ChatGPT differently than a
doctor.

206
00:11:44,200 --> 00:11:47,300
But the point is that this is
the equalizer.

207
00:11:47,700 --> 00:11:52,200
We are giving everybody Access
to information at your

208
00:11:52,200 --> 00:11:54,700
fingertips.
So rather than going in trying

209
00:11:54,700 --> 00:11:58,400
to compete with what everybody else
is doing, I just want to go and

210
00:11:58,400 --> 00:12:03,400
support what everybody else is
doing, knowing confidently

211
00:12:04,500 --> 00:12:06,600
that there is no one else that
is going to build what I'm

212
00:12:06,600 --> 00:12:10,800
building, because everybody else is
worrying about algorithms and

213
00:12:10,800 --> 00:12:19,700
I'm focusing on data.
That's awesome and I don't know

214
00:12:19,900 --> 00:12:22,800
I want to ask this question.
Algorithms are so important

215
00:12:22,800 --> 00:12:24,700
especially on social media
sites.

216
00:12:24,700 --> 00:12:26,400
I'm going to hold back on that
one real quick.

217
00:12:26,400 --> 00:12:29,200
I'll let Jason Jump in, I'm
going to hold back on the

218
00:12:29,200 --> 00:12:31,700
algorithm question, the algorithm
versus the data.

219
00:12:31,700 --> 00:12:35,800
So Jason, you can pop in here.
I definitely want to come to

220
00:12:35,800 --> 00:12:38,400
that, and there we go with
the dings.

221
00:12:38,600 --> 00:12:42,400
So as you were talking Eric, I
had these clusters in my mind

222
00:12:42,400 --> 00:12:45,100
that I was creating as you were
walking us through

223
00:12:45,200 --> 00:12:49,900
what Project Rhizome is
actually doing. The first cluster

224
00:12:49,900 --> 00:12:53,300
is how your project is a
microcosm.

225
00:12:53,300 --> 00:12:57,300
in many ways, of a lot of the
issues and concerns and concepts

226
00:12:57,400 --> 00:13:00,600
that we're thinking about with AI,
when we think about

227
00:13:00,600 --> 00:13:08,400
transparency and ethics and who
is managing knowledge bases.

228
00:13:08,400 --> 00:13:12,900
If we go into ChatGPT,
sometimes it's really hard to

229
00:13:12,900 --> 00:13:15,100
figure out where that knowledge
is coming

230
00:13:15,300 --> 00:13:19,200
from, and who's controlling
that data or sourcing that data,

231
00:13:19,500 --> 00:13:21,100
I think that's got to change as we
go forward.

232
00:13:21,100 --> 00:13:28,300
But it's always been a question
built into the use of AI and the

233
00:13:28,300 --> 00:13:31,000
other concept that you brought
out is you're talking about

234
00:13:31,000 --> 00:13:33,700
Project
Rhizome as social building,

235
00:13:34,000 --> 00:13:36,500
right?
And really creating in public

236
00:13:36,500 --> 00:13:41,100
and using people, and I think
associated with that connected

237
00:13:41,100 --> 00:13:45,100
to that is this question of what
happens to the expert.

238
00:13:45,200 --> 00:13:48,900
But in the age of AI, I think
there's a lot of anxiety about

239
00:13:48,900 --> 00:13:53,400
that and I was literally reading
a book titled The New Laws of

240
00:13:53,400 --> 00:13:59,500
Robotics by Frank Pasquale, and
that's what he talks about.

241
00:13:59,500 --> 00:14:03,100
His argument is actually that AI
makes the expert more valuable,

242
00:14:03,300 --> 00:14:08,700
that it actually gives them more
value socially, culturally, and

243
00:14:08,800 --> 00:14:12,100
hopefully economically.
That's one of the clusters

244
00:14:12,300 --> 00:14:15,800
that I started to create out of
your talking. The other idea

245
00:14:15,800 --> 00:14:18,300
that you brought up, which
really spoke to me.

246
00:14:18,300 --> 00:14:20,700
Personally, I know, Holly is
someone who teaches all the

247
00:14:20,700 --> 00:14:23,000
time, just like Lee spoke to
you, too.

248
00:14:23,200 --> 00:14:28,900
It's just how much time you can
save and how you can repurpose

249
00:14:28,900 --> 00:14:32,800
your time to do very high impact
things.

250
00:14:32,800 --> 00:14:36,800
From my perspective, that is
fundamentally true.

251
00:14:37,100 --> 00:14:42,500
It used to take me four hours to
create a personalized rubric for

252
00:14:42,500 --> 00:14:45,100
class.
It now takes me four minutes.

253
00:14:45,200 --> 00:14:48,100
It used to take me, I don't

254
00:14:48,100 --> 00:14:51,600
know, yeah, 15 minutes to write

255
00:14:51,600 --> 00:14:53,700
student feedback and everything
else.

256
00:14:53,800 --> 00:14:57,500
It now takes me about a minute
and a half, right?

257
00:14:57,500 --> 00:15:01,200
Editing a podcast used
to take me two hours.

258
00:15:01,400 --> 00:15:06,500
It now takes me eight minutes.
That's like that were, that's

259
00:15:06,500 --> 00:15:08,100
what we're talking about.
That's why we're talking about

260
00:15:08,100 --> 00:15:11,800
the level of time-saving.
And so, a project, like project

261
00:15:11,800 --> 00:15:14,900
Rhizome, at least for me, is a
model.

262
00:15:15,200 --> 00:15:18,800
Others can use for how you can
ethically source data and be

263
00:15:18,808 --> 00:15:21,800
transparent about where it's
coming from and all of that.

264
00:15:21,800 --> 00:15:25,100
I want to follow this up with a
question about Project Rhizome,

265
00:15:25,400 --> 00:15:29,700
and I think it's connected to a
lot of those Concepts that you

266
00:15:29,700 --> 00:15:32,800
brought up, but then also your
focus on assessment.

267
00:15:33,500 --> 00:15:39,500
So how personalizable is it?
Because you mentioned using data

268
00:15:39,600 --> 00:15:42,400
from different disciplines
different levels.

269
00:15:42,500 --> 00:15:45,000
So is there a way that Project
Rhizome

270
00:15:45,200 --> 00:15:49,400
would be targeting that
feedback for a student of a

271
00:15:49,408 --> 00:15:51,700
particular skill level? Say
they're still struggling with

272
00:15:51,700 --> 00:15:54,300
something, or it's a student
who's, say, a freshman in college

273
00:15:54,300 --> 00:15:58,600
versus a senior in college.
Is that feedback tailored or how

274
00:15:58,600 --> 00:16:01,200
is that working in terms of
your individual project?

275
00:16:02,100 --> 00:16:05,700
Yeah.
So I'll say this one thing, too:

276
00:16:05,700 --> 00:16:11,400
I think that the way that you
described How you heard these

277
00:16:11,400 --> 00:16:14,100
different themes
and talked about clustering is

278
00:16:14,100 --> 00:16:16,900
exactly why artificial
intelligence is intelligent and

279
00:16:16,900 --> 00:16:18,900
human-like.
Because that is what artificial

280
00:16:18,900 --> 00:16:21,200
intelligence does.
It looks at a whole bunch of

281
00:16:21,200 --> 00:16:25,400
content and then it clusters it
by topic and then predicts based

282
00:16:25,400 --> 00:16:28,300
on that clustering, what it
should say. That is the exact

283
00:16:28,300 --> 00:16:31,400
same thought process.
That is why it's called a neural

284
00:16:31,500 --> 00:16:33,900
network.
It's mimicking neurons and so

285
00:16:33,900 --> 00:16:35,500
that's just a really cool thing
for me.
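
(A toy illustration of that cluster-by-topic idea, using TF-IDF and k-means on a few feedback comments. It sketches topical clustering in general and assumes scikit-learn is installed; it is not a claim about how any particular model, or Project Rhizome, works internally.)

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

comments = [
    "The transition between paragraphs two and three is abrupt.",
    "This paragraph jumps to a new idea with no transition.",
    "Great thesis statement in the introduction.",
    "The introduction clearly states the argument.",
]

# Turn each comment into a TF-IDF vector, then group similar vectors.
vectors = TfidfVectorizer(stop_words="english").fit_transform(comments)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, text in zip(labels, comments):
    print(label, text)  # similar comments tend to land in the same cluster
```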

286
00:16:35,700 --> 00:16:38,900
I'm glad that you did that. To
address your question, one of

287
00:16:38,900 --> 00:16:41,400
the data points
that I'm collecting is the

288
00:16:41,400 --> 00:16:44,000
level.
So I'm accepting writing from

289
00:16:44,000 --> 00:16:48,000
any discipline. When a user goes
in and submits their writing, I'm

290
00:16:48,000 --> 00:16:51,300
asking them a whole bunch of
questions about their writing so

291
00:16:51,300 --> 00:16:55,100
that I can then tabulate that
and find trends and patterns

292
00:16:55,100 --> 00:16:57,300
later.
So one of those questions is

293
00:16:57,700 --> 00:17:00,300
what is the name of the class?
Is this a history class, is it a

294
00:17:00,300 --> 00:17:01,800
biology class that kind of
thing?

295
00:17:02,100 --> 00:17:03,900
The other one is at what level
is it?

296
00:17:03,908 --> 00:17:08,099
I'm accepting 9th grade writing
through doctoral level writing,

297
00:17:08,200 --> 00:17:12,200
like through post-grad, because
I believe this fundamentally:

298
00:17:13,000 --> 00:17:17,300
that the core principles of a
writing class in a postdoc are

299
00:17:17,300 --> 00:17:19,500
the exact same lesson plan
you're going to get in 9th

300
00:17:19,500 --> 00:17:22,200
grade.
They're the same thing just the

301
00:17:22,200 --> 00:17:26,500
complexity changes, but the core
principles of what writing is

302
00:17:26,800 --> 00:17:29,800
stay the same.
That's why writing is a meta

303
00:17:29,800 --> 00:17:33,900
discipline. When we get to phase
3 or 4, when we get to the stage

304
00:17:33,900 --> 00:17:36,600
of developing the personalized
feedback and probably tapping

305
00:17:36,600 --> 00:17:39,400
some like other large language
models.

306
00:17:39,600 --> 00:17:42,500
to make it sound better,
like ChatGPT or something.

307
00:17:43,000 --> 00:17:46,600
But being able to say, okay,
how would you model

308
00:17:46,600 --> 00:17:51,000
this feedback if the student was
in ninth grade versus in

309
00:17:51,000 --> 00:17:52,500
postdoctoral?
You might be able to be a

310
00:17:52,500 --> 00:17:54,900
bit more short with them,
more to the point.
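
(A hedged sketch of what that level-aware feedback prompting could look like in a later phase; the tone guide, function name, and wording are illustrative assumptions, not the project's design.)

```python
TONE_BY_LEVEL = {
    "9th grade": "Be encouraging, define any terms, and give one concrete example.",
    "undergraduate": "Be direct and name the rubric criterion you are applying.",
    "postdoctoral": "Be brief and to the point; skip definitions of basic terms.",
}

def feedback_prompt(sentence: str, criterion: str, level: str) -> str:
    # Assemble a prompt that asks a chat model to phrase rubric feedback
    # for the writer's level.
    return (
        f"You are giving writing feedback on the criterion '{criterion}'.\n"
        f"Writer level: {level}. {TONE_BY_LEVEL[level]}\n"
        f"Sentence: {sentence}\n"
        "Say whether the sentence meets the criterion and explain why."
    )

print(feedback_prompt("However, the data tell a different story.",
                      "transitions", "9th grade"))
```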

311
00:17:55,600 --> 00:18:00,400
Because we're collecting
this data about the writing

312
00:18:00,400 --> 00:18:05,000
itself, we'll be able to find
patterns. We will be able to,

313
00:18:05,000 --> 00:18:09,400
like I said before, ask AI to
see what clusters exist.

314
00:18:09,500 --> 00:18:12,800
ones that we may not see.
There may be other clusters or

315
00:18:12,800 --> 00:18:15,300
themes in what I said, but we
didn't see them because we're

316
00:18:15,300 --> 00:18:17,800
not artificial intelligences
looking at everything over time.

317
00:18:19,300 --> 00:18:24,800
That's why Collecting the data
is so important.

318
00:18:25,400 --> 00:18:29,400
One of the largest data models,
or one of the largest data sets,

319
00:18:29,400 --> 00:18:31,700
that a lot of these large
language models are being

320
00:18:32,000 --> 00:18:36,400
trained on are from high school
essays written for the SAT. You

321
00:18:36,400 --> 00:18:40,100
can actually go and find a
database of twelve to fourteen

322
00:18:40,100 --> 00:18:44,100
thousand student-written essays.
The reason ChatGPT

323
00:18:44,100 --> 00:18:47,300
gives you a standard five
paragraph essay.

324
00:18:47,800 --> 00:18:50,500
when you just type in a regular
question is because it was

325
00:18:50,500 --> 00:18:53,700
trained
on high school five-paragraph

326
00:18:53,700 --> 00:18:56,100
essays.
And those were being written and

327
00:18:56,100 --> 00:19:00,500
graded by people who preferred
long-winded writing.

328
00:19:01,800 --> 00:19:06,500
So the reason ChatGPT is
verbose is because of the

329
00:19:06,500 --> 00:19:12,300
data and the metadata about it.
I don't know, this might be

330
00:19:12,300 --> 00:19:13,900
getting on a tangent now.
Right?

331
00:19:14,100 --> 00:19:17,000
I don't think that the solution
to our problems is to embed

332
00:19:17,000 --> 00:19:19,700
English 101 into a biology
class.

333
00:19:20,800 --> 00:19:24,400
I don't think English 101, is a
good practice of writing because

334
00:19:24,400 --> 00:19:26,800
when you get to the business
world, you're taught how to be

335
00:19:26,800 --> 00:19:29,900
more concise. Like, you're taught
to be using

336
00:19:29,900 --> 00:19:31,800
what's called plain language,
right?

337
00:19:31,800 --> 00:19:34,300
So Obama passed the
Plain Writing Act of 2010.

338
00:19:35,800 --> 00:19:37,900
That's what I want to embed,
right?

339
00:19:37,900 --> 00:19:40,900
I was talking to people who
teach MBAs, and they say one

340
00:19:40,900 --> 00:19:42,900
of the biggest things they're
teaching their students is how to

341
00:19:42,900 --> 00:19:45,800
write less, be more concise,
to word it better.

342
00:19:46,400 --> 00:19:48,800
And that's how we train
ChatGPT to be better, we tell it

343
00:19:48,800 --> 00:19:51,700
to make this answer shorter.
I kind of went off on a big

344
00:19:51,700 --> 00:19:55,500
tangent there, but to answer
that question:

345
00:19:56,000 --> 00:19:59,400
in phases two or three, there's going
to be that personalized level of

346
00:19:59,400 --> 00:20:03,600
feedback based on a discipline
and based on level.

347
00:20:04,100 --> 00:20:08,500
But rather than Starting with
that discipline or that level

348
00:20:08,500 --> 00:20:10,200
and saying this is how each
should sound,

349
00:20:10,700 --> 00:20:14,700
we're asking the data first how
it should sound. So we're reacting to

350
00:20:14,708 --> 00:20:17,900
the data and not coming in with
our own biases to begin with.

351
00:20:19,500 --> 00:20:23,600
And I like it.
So, I was going to ask, there's

352
00:20:23,600 --> 00:20:28,000
a lot of pushback about this,
like using this AI stuff and

353
00:20:28,500 --> 00:20:31,500
obviously not in this room,
we're all just, let's do it.

354
00:20:31,500 --> 00:20:33,600
Let's use it.
Let's have fun with it.

355
00:20:33,600 --> 00:20:36,200
Let's see what we're doing.
How would you approach the

356
00:20:36,200 --> 00:20:38,600
situation of the people who are
already resisting

357
00:20:38,600 --> 00:20:43,200
this sort of situation where
students' writing is going to be

358
00:20:43,700 --> 00:20:45,500
filtered through a system and
then they're going to get

359
00:20:45,500 --> 00:20:48,200
feedback.
How would you deal with the

360
00:20:48,200 --> 00:20:51,300
resistance
that is on the rise again,

361
00:20:51,400 --> 00:20:54,900
again, with technology and this
new innovation with AI and

362
00:20:54,900 --> 00:20:59,000
ChatGPT, how would you approach that
in the seat that you're in right

363
00:20:59,000 --> 00:21:01,800
now and deal with that?
That's a loaded question.

364
00:21:01,800 --> 00:21:06,000
By the way, back in the day,
back in the day, couple thousand

365
00:21:06,000 --> 00:21:12,200
years ago, Plato was ranting
and raving, right, about the

366
00:21:12,200 --> 00:21:16,600
creation of a new technology
that would ruin thinking.

367
00:21:17,900 --> 00:21:19,500
He was talking about the
alphabet.

368
00:21:26,000 --> 00:21:29,400
Every single time something new
happens,

369
00:21:30,800 --> 00:21:35,700
People will not like it and
they'll disagree with it.

370
00:21:36,400 --> 00:21:40,300
And I think that has far more to
do with one's experience in life

371
00:21:40,300 --> 00:21:43,300
and where they are in life and
how much they're willing to

372
00:21:43,300 --> 00:21:45,800
learn something new versus not
new.

373
00:21:47,100 --> 00:21:50,800
That determines that.
So when I encounter someone

374
00:21:50,800 --> 00:21:54,900
who's like, I'm not gonna do that, my
response is okay.

375
00:21:56,500 --> 00:21:58,500
I'll see you in five years,
when you're shut out. Like, they're

376
00:21:58,500 --> 00:22:02,800
gonna... I know that because I
have already seen this actively

377
00:22:03,500 --> 00:22:08,400
happening. I have a friend,
he's a CEO of a company, and he

378
00:22:08,400 --> 00:22:12,300
hires people, and before he
would hire teachers to create

379
00:22:12,300 --> 00:22:18,000
content for him. Now he hires
one teacher that uses ChatGPT

380
00:22:18,200 --> 00:22:25,000
to create content for him.
The people who are saying this

381
00:22:25,000 --> 00:22:27,000
is not something I need to worry
about.

382
00:22:28,000 --> 00:22:32,800
do not understand how
fundamentally intertwined their

383
00:22:32,800 --> 00:22:37,300
lives, already are with AI and
how they are going to be

384
00:22:37,300 --> 00:22:40,800
intertwined with AI, just in
the productivity suites from

385
00:22:40,800 --> 00:22:45,000
Microsoft and Google that now
have their Copilot and their

386
00:22:45,000 --> 00:22:48,600
Bard embedded in them.
Whatever we're going to call it, you

387
00:22:48,600 --> 00:22:55,700
cannot escape it, and if you're
actively resisting it, you are

388
00:22:55,800 --> 00:22:58,700
actively putting yourself at
risk.

389
00:22:59,000 --> 00:23:01,900
I am a huge fan. Like, you need to
be an early adopter.

390
00:23:03,000 --> 00:23:06,000
If only to be asking those
hard, ethical questions, to be

391
00:23:06,000 --> 00:23:07,800
getting ahead, so you're not dealing with

392
00:23:07,800 --> 00:23:09,800
changing procedures later.
Be the one to

393
00:23:09,800 --> 00:23:12,600
implement the change,
not the one that has to deal

394
00:23:12,600 --> 00:23:16,400
with someone else telling you to
change. And so I guess that I

395
00:23:16,400 --> 00:23:19,500
think there's always reason to
be cautious, but I think that

396
00:23:20,600 --> 00:23:24,200
we're all, holy crap, we talk
about our age demographic and

397
00:23:24,200 --> 00:23:26,900
everything that we've lived
through. Like, we've gone through a

398
00:23:26,908 --> 00:23:30,100
lot, like wars and terrorist
attacks and so many things.

399
00:23:32,600 --> 00:23:38,500
Holy moly.
We are living through a modern

400
00:23:38,500 --> 00:23:42,500
Industrial Revolution that is
happening 10 times as fast,

401
00:23:42,500 --> 00:23:45,600
where you can see daily change
instead of monthly or yearly

402
00:23:45,600 --> 00:23:49,100
change. It's going to spit
out, like, an update.

403
00:23:49,200 --> 00:23:52,000
You don't have to wait for it to
be in the SaaS and have it

404
00:23:52,008 --> 00:23:55,400
spit out an update.
Yeah, it's instantaneous.

405
00:23:56,500 --> 00:23:59,900
I think that there are going to
be amazing things that happen.

406
00:23:59,900 --> 00:24:02,800
I believe that artificial
intelligence will be the

407
00:24:02,800 --> 00:24:04,800
introduction of a permanent
four-day work week.

408
00:24:05,100 --> 00:24:07,400
Like, artificial
intelligence will show us that

409
00:24:07,600 --> 00:24:11,200
working 40 hours a week is no
longer required and we can have

410
00:24:11,200 --> 00:24:15,200
more leisure time and happiness.
There's a lot of good that's going

411
00:24:15,200 --> 00:24:18,000
to come out of it, I really do.
And we can go

412
00:24:18,000 --> 00:24:23,300
dystopian if you want to, and it's
gonna get there, but also, man,

413
00:24:23,300 --> 00:24:28,700
it's beautiful.
You brought up so much,

414
00:24:28,700 --> 00:24:32,300
Eric, that I want to talk about.
The first is your point, which I

415
00:24:32,300 --> 00:24:34,700
think is so pivotal for the
higher ed sector.

416
00:24:34,700 --> 00:24:39,500
It's that AI has been around for
a long while.

417
00:24:39,500 --> 00:24:42,700
This kind of tech has been
around. What really changed with

418
00:24:42,700 --> 00:24:45,000
ChatGPT is that it was pushed

419
00:24:45,000 --> 00:24:50,600
into the public imagination and
the ux was made so much more

420
00:24:50,600 --> 00:24:52,800
accessible.
That's the big thing.

421
00:24:52,800 --> 00:24:55,900
that ChatGPT did: creating the
chat function.

422
00:24:55,900 --> 00:25:02,900
So it was almost more about the
UX, the design of it, than the

423
00:25:02,900 --> 00:25:07,000
actual Tech and it's so worked
into our lives.

424
00:25:07,400 --> 00:25:09,100
and we're so used to it that we
almost don't even notice.

425
00:25:09,100 --> 00:25:11,200
There are a lot
of us who didn't even notice it was

426
00:25:11,200 --> 00:25:13,300
happening.
Then the other thing you brought

427
00:25:13,300 --> 00:25:16,700
up and seemed to suggest was
that tech like ChatGPT

428
00:25:16,700 --> 00:25:23,800
is the tip of the iceberg.
These are just early moves.

429
00:25:23,800 --> 00:25:28,000
So I like to think about what
people might think, 50 years

430
00:25:28,000 --> 00:25:30,800
from now.
And one of the things I

431
00:25:30,800 --> 00:25:33,800
think they will do is this: 50
years from now,

432
00:25:34,200 --> 00:25:36,300
children are going to be in
school and they'll be

433
00:25:36,300 --> 00:25:41,200
shown ChatGPT, and they're
going to be horrified about how

434
00:25:41,200 --> 00:25:44,600
bad it is.
How awful a product that is?

435
00:25:44,600 --> 00:25:47,300
Sam Altman has also talked
about this, by the way.

436
00:25:47,500 --> 00:25:49,500
He said that,
just so you know, ChatGPT is

437
00:25:49,500 --> 00:25:52,600
actually kind of awful as a
user experience.

438
00:25:52,600 --> 00:25:55,000
It's down all the time. You have
to

439
00:25:55,000 --> 00:25:59,200
spend all of these hours
figuring out how to use it and

440
00:25:59,200 --> 00:26:01,600
prompt it right.
He was essentially saying,

441
00:26:01,600 --> 00:26:06,900
this is an awful product, but it
really gave us something that we

442
00:26:06,900 --> 00:26:09,400
can play with and I think that
50 years from now.

443
00:26:09,600 --> 00:26:11,900
it's going to be so much

444
00:26:11,900 --> 00:26:14,800
more advanced in terms of being
user friendly, so you won't have

445
00:26:14,800 --> 00:26:19,400
to, as a lot of us
did, I did, spend weeks really

446
00:26:19,400 --> 00:26:22,800
playing with it to be like, oh, I
gotta use it. Now it's just

447
00:26:22,800 --> 00:26:26,900
going to continue to advance, and
it's advancing every day. And I

448
00:26:26,908 --> 00:26:29,800
want to follow it up with a
question Eric.

449
00:26:29,800 --> 00:26:33,700
That's connected to that. And you
mentioned, and I think you're

450
00:26:33,700 --> 00:26:38,000
absolutely right,
that AI is an industrial

451
00:26:38,000 --> 00:26:40,800
revolution.
That's changing every day every

452
00:26:40,800 --> 00:26:43,500
week.
How do you stay on top of it.

453
00:26:43,600 --> 00:26:46,000
What do you follow?
What do you look at?

454
00:26:46,100 --> 00:26:49,800
How do you feel like you're at
least abreast of what's

455
00:26:49,800 --> 00:26:51,700
Happening?
I'll say that.

456
00:26:51,700 --> 00:26:56,100
I don't feel like I am because
just like you said, because

457
00:26:56,100 --> 00:27:00,200
ChatGPT is such an easy user
interface.

458
00:27:00,600 --> 00:27:05,200
And because anybody can access
it using their own expertise,

459
00:27:05,400 --> 00:27:09,300
there are just dozens of
applications that I'm seeing

460
00:27:09,300 --> 00:27:13,700
every day.
I love watching TikTok. Like, a

461
00:27:13,700 --> 00:27:17,400
lot of the idea generation that
I get is from other people who

462
00:27:17,400 --> 00:27:18,900
get on and share, what they're
doing.

463
00:27:19,600 --> 00:27:23,200
I'm pretty active on LinkedIn,
I'm constantly scouring

464
00:27:23,400 --> 00:27:26,900
the news and social media. Like, I
consume a lot of information

465
00:27:27,100 --> 00:27:29,700
from a variety of different
resources and so I think that

466
00:27:29,700 --> 00:27:33,800
helps a lot. I'm a big fan of
the content from Justin Feinberg, I

467
00:27:33,808 --> 00:27:37,700
think his name is, and Rachel
Woods on TikTok specifically.

468
00:27:37,700 --> 00:27:40,900
The AI Exchange newsletter,
I think, is like the go-to place

469
00:27:41,000 --> 00:27:43,400
to get information.
I also have a really good friend

470
00:27:43,400 --> 00:27:47,200
of mine, his name is Ben, and we
talk about data.

471
00:27:48,000 --> 00:27:50,000
He's the one that helped me with
my dissertation research.

472
00:27:50,000 --> 00:27:52,300
He was the data scientist
that collaborated with me and

473
00:27:52,300 --> 00:27:54,100
Katie.
So, I don't know if that was

474
00:27:54,100 --> 00:28:00,700
like an answer. I think the answer
to your question can be applicable

475
00:28:00,700 --> 00:28:01,900
for the people who are
listening.

476
00:28:03,100 --> 00:28:08,100
It's that the best thing to do is
to start listening and to start

477
00:28:08,100 --> 00:28:11,200
playing, don't ignore those
articles that you're seeing and

478
00:28:11,200 --> 00:28:13,800
seeing that it's everywhere.
Go and see it and then go and

479
00:28:13,800 --> 00:28:19,000
sit down in ChatGPT and try
it. At work, I have, like, my

480
00:28:19,000 --> 00:28:27,800
full-time gig, and I
love being able to show people

481
00:28:28,300 --> 00:28:31,500
face-to-face, what this
technology can do and just watch

482
00:28:31,500 --> 00:28:35,300
the "aha" happen, then have them
come up to me. Like, my boss

483
00:28:35,300 --> 00:28:38,800
asked, can you make me a
limerick about a leprechaun who

484
00:28:38,800 --> 00:28:40,400
wants raisins
in his carrot cake? And I was

485
00:28:40,400 --> 00:28:42,700
like, that's what it takes to
make you happy, man.

486
00:28:42,900 --> 00:28:45,300
Yes, I will.
Maybe I'll get a raise after that.

487
00:28:45,300 --> 00:28:47,600
Yeah.
And then, like, I did the same

488
00:28:47,600 --> 00:28:50,400
thing.
Yeah, of course, my friend's

489
00:28:50,400 --> 00:28:52,800
quitting her job and she has to write
a resignation letter, and I was

490
00:28:52,800 --> 00:28:54,600
like, oh, we're on it already.
Boom.

491
00:28:54,600 --> 00:28:59,500
Boom, let me email that to you.
Like, it's so much fun to see

492
00:29:00,100 --> 00:29:06,700
people's own curiosity and
expertise emerge when they use

493
00:29:06,700 --> 00:29:08,500
it themselves, and just to see,
like,

494
00:29:08,500 --> 00:29:15,100
How Just imagine how much more
creative and beautiful things

495
00:29:15,100 --> 00:29:18,900
are going to be.
Because we have the ability that

496
00:29:19,000 --> 00:29:21,600
someone like me, and I don't consider
myself artistic, can go and make

497
00:29:21,600 --> 00:29:25,200
something beautiful.
I think that's, I think that's

498
00:29:25,200 --> 00:29:27,600
amazing.
And I love it that you say, you

499
00:29:27,600 --> 00:29:29,100
have to go, you have to play
with it.

500
00:29:29,100 --> 00:29:31,500
I feel like people, before
they write it off,

501
00:29:31,500 --> 00:29:35,200
That's the thing you need to do
is just go play with it and ask

502
00:29:35,200 --> 00:29:37,100
it.
Like, you're seriously

503
00:29:37,100 --> 00:29:40,200
like just typing in a question
or two.

504
00:29:40,400 --> 00:29:43,400
When we need to do something,
it's what we do every day on our

505
00:29:43,400 --> 00:29:45,800
computers: you're
telling it to do things that you

506
00:29:45,808 --> 00:29:49,200
need it to do, so it's mimicking
what we're already doing.

507
00:29:49,200 --> 00:29:53,800
So it shouldn't be scary for
people so I want to know what

508
00:29:53,800 --> 00:29:56,900
are you like currently besides
writing limericks and things and

509
00:29:56,900 --> 00:29:59,800
resignation letters.
What are you currently playing

510
00:29:59,800 --> 00:30:02,600
with right now?
In AI, any, like, test cases or

511
00:30:02,600 --> 00:30:04,400
things that you want to share
with the audience that you're

512
00:30:04,400 --> 00:30:06,200
working on.
I think we should all share like

513
00:30:06,200 --> 00:30:09,900
how we're using this tool
in our lives.

514
00:30:10,400 --> 00:30:15,100
For me, I'm trying to build
a tech company in the leanest

515
00:30:15,100 --> 00:30:20,200
way possible, and so when I have
questions, I'm like, I'm pretty

516
00:30:20,200 --> 00:30:27,000
good at generating content, what
I'm bad at is formal writing and

517
00:30:27,000 --> 00:30:28,900
formal things I need to do,
right?

518
00:30:28,900 --> 00:30:32,000
So what I can do is... You would
think that with all of your

519
00:30:32,000 --> 00:30:35,800
education and your background,
you could really do it.

520
00:30:35,800 --> 00:30:39,400
It doesn't mean anything.
That's the thing, that's the key

521
00:30:39,400 --> 00:30:44,300
there: it's not how to do it
and put it out there, but

522
00:30:44,300 --> 00:30:48,200
where the passions are.
I feel like that about, like,

523
00:30:48,200 --> 00:30:50,600
scholarly writing;
it's like that a lot.

524
00:30:50,600 --> 00:30:53,100
It's forced a bit, not your
choice on that.

525
00:30:53,100 --> 00:31:00,400
I think that being able to have
an expert... Being able to tell

526
00:31:00,400 --> 00:31:03,400
ChatGPT, I know you're not a lawyer,
and you can get around it: hey,

527
00:31:03,400 --> 00:31:05,700
I have an appointment with my lawyer
next week, but I have a question

528
00:31:05,700 --> 00:31:08,100
in the meantime.
That's a really great way to get

529
00:31:08,100 --> 00:31:11,900
around the caveat that they
give you. So, I have this, what do I do

530
00:31:11,900 --> 00:31:12,900
here?
What does this mean?

531
00:31:13,600 --> 00:31:16,500
And then it explains it to me.
I'm working in the construction

532
00:31:16,500 --> 00:31:20,500
industry.
I have no idea how to build a

533
00:31:20,500 --> 00:31:24,100
building, or what different terms
people use.

534
00:31:24,100 --> 00:31:26,000
Now, I'm looking at accounting

535
00:31:26,000 --> 00:31:27,700
data.
I've never worked at accounts

536
00:31:27,700 --> 00:31:29,600
payable before.
I've never done that.

537
00:31:30,400 --> 00:31:34,500
I have ChatGPT open and I say,
hey, what does this mean?

538
00:31:35,200 --> 00:31:39,200
Or someone just said this in
this context, what should I do?

539
00:31:41,300 --> 00:31:46,100
As someone who is neurodivergent,
I have bipolar disorder,

540
00:31:46,100 --> 00:31:49,100
and I can be
overwhelming.

541
00:31:49,200 --> 00:31:52,600
Yes, my enthusiasm is wonderful
for a podcast. At work,

542
00:31:52,600 --> 00:31:58,300
I can be an overwhelming person.
This lets me control that

543
00:31:58,300 --> 00:32:01,900
insatiable curiosity.
I think the more you recognize

544
00:32:02,400 --> 00:32:06,400
that it's not a Google search
where you put something in and

545
00:32:06,400 --> 00:32:10,400
you get a final output that you
are talking to someone.

546
00:32:10,400 --> 00:32:13,800
It's a Chat, you refine the
conversation.

547
00:32:14,100 --> 00:32:17,000
You can't go up to a stranger
and give them a command.

548
00:32:17,000 --> 00:32:18,900
Like, write me a press release
for this.

549
00:32:19,200 --> 00:32:20,800
They're going to turn around be
like, what the fuck are you

550
00:32:20,800 --> 00:32:23,200
talking about?
No, you turn around to like,

551
00:32:23,200 --> 00:32:25,100
hey, how's it going?
This is me.

552
00:32:25,100 --> 00:32:29,100
You introduce yourself, but you
give context. The more you treat

553
00:32:29,100 --> 00:32:32,700
artificial intelligence as its
own actor,

554
00:32:34,000 --> 00:32:37,400
the better the outputs
you're going to get.

555
00:32:37,400 --> 00:32:39,100
Sorry, I swore.

556
00:32:39,300 --> 00:32:42,900
We'll bleep that out.
Or I think we might like it, it

557
00:32:43,000 --> 00:32:46,900
adds emphasis, I don't know.
I have no problem leaving that

558
00:32:46,900 --> 00:32:47,900
in.
I'm gonna keep it in there.

559
00:32:48,200 --> 00:32:51,100
I'm not gonna bleep it out.
I probably have to put the

560
00:32:51,100 --> 00:32:55,800
explicit marker or whatever, or
they will track it and shut me down,

561
00:32:55,800 --> 00:33:01,600
who knows?
And I love the merits of your idea.

562
00:33:02,500 --> 00:33:07,400
I love your idea of using AI as
a tutor, one of the things that

563
00:33:07,400 --> 00:33:12,700
has happened to me and I know
this must sound meta is, AI has

564
00:33:12,700 --> 00:33:17,500
allowed me to get into AI.
There are all these Concepts out

565
00:33:17,500 --> 00:33:21,300
there that as I read and look at
everything on social media and

566
00:33:21,300 --> 00:33:23,400
learn and I follow Justin
Feinberg.

567
00:33:23,500 --> 00:33:26,500
I follow Rachel Woods, I learn
a lot from them, and every once

568
00:33:26,500 --> 00:33:29,200
in a while I get curious about
something.

569
00:33:29,400 --> 00:33:34,200
And in the past, 10 years ago, I
would have written a note in a

570
00:33:34,200 --> 00:33:36,300
notebook somewhere.
Sometimes I'd look it up,

571
00:33:36,300 --> 00:33:39,100
sometimes I wouldn't.
Now what I do is I take my

572
00:33:39,100 --> 00:33:43,700
device, I go immediately to Poe, which
is my go-to right now because

573
00:33:43,700 --> 00:33:47,200
it's a very, it's a very easy
way to have everything in one

574
00:33:47,200 --> 00:33:49,400
spot and I'll ask it to teach
me.

575
00:33:49,400 --> 00:33:53,700
So I was reading the other day
about AI, and "ground truth" was

576
00:33:53,700 --> 00:33:55,400
just thrown out there, the term comes
up.

577
00:33:55,500 --> 00:34:00,000
And so I was able to go in and
use GPT through Poe and just

578
00:34:00,000 --> 00:34:04,400
say, teach me about this
thing. It isn't just doing a web

579
00:34:04,400 --> 00:34:08,900
search, because I was able,
through the prompt, to

580
00:34:08,900 --> 00:34:11,900
tailor it to myself.
I was able to say, explain to me

581
00:34:11,900 --> 00:34:16,600
like you would a ten-year-old
and give me at least one analogy

582
00:34:16,600 --> 00:34:21,900
that will allow me to grasp onto
it, and doing that in the past,

583
00:34:21,900 --> 00:34:23,699
would have been very difficult or
impossible.
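
(A tiny sketch of that "teach me" prompt pattern; the function and wording are illustrative only, and the resulting text could be pasted into Poe, ChatGPT, or any chat model.)

```python
def teach_me(concept: str, audience: str = "a ten-year-old", analogies: int = 1) -> str:
    # Build the kind of tailored "explain it to me" prompt described above.
    return (
        f"Teach me about {concept}. Explain it the way you would to {audience}, "
        f"and give me at least {analogies} analogy I can hold on to."
    )

print(teach_me("ground truth in AI"))
```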

584
00:34:23,699 --> 00:34:26,500
If I looked it up at all, I
would have just ended up on

585
00:34:26,500 --> 00:34:31,699
dictionary.com or Wikipedia,
which I'm still Blown Away by

586
00:34:31,699 --> 00:34:35,000
how in many ways inaccessible
Wikipedia is on the level of

587
00:34:35,000 --> 00:34:37,800
language.
I'm constantly going in there to

588
00:34:37,800 --> 00:34:41,699
learn something technical and I'm
bombarded with technical

589
00:34:41,699 --> 00:34:44,100
language, even though it's
Wikipedia.

590
00:34:44,100 --> 00:34:47,699
Yeah, but even from there, I get
lost and I don't know what to do

591
00:34:48,100 --> 00:34:50,699
Now, with AI, I'm able to learn about

592
00:34:50,699 --> 00:34:53,699
AI, and so
it allows me to get into a

593
00:34:53,699 --> 00:34:56,900
discipline and actually learn what
I'm talking about and a little of

594
00:34:56,900 --> 00:34:58,700
what's going on,
you know?

595
00:34:58,700 --> 00:35:04,500
So I was sitting down with my
brother, and he's getting

596
00:35:04,500 --> 00:35:07,300
ready, like next week actually, to
sit down for his PMP exam.

597
00:35:07,600 --> 00:35:10,300
And he was
telling me about, like, ways that

598
00:35:10,300 --> 00:35:12,400
he's used it before and like
he's just like trying to figure

599
00:35:12,400 --> 00:35:15,700
out like how he can go and
prepare for his exam.

600
00:35:16,000 --> 00:35:18,400
He was also sharing with me
the way he likes to learn,

601
00:35:18,800 --> 00:35:20,700
because, like we said, you can
tailor it. I like to

602
00:35:20,700 --> 00:35:23,100
learn by, hey, now give me
some analogies. The way he

603
00:35:23,100 --> 00:35:25,700
likes to learn is, hey, here's
this real-world application that

604
00:35:25,700 --> 00:35:28,200
I have, how can I apply it?
So we're just, like, talking

605
00:35:28,200 --> 00:35:29,800
off the cuff.
Right?

606
00:35:29,800 --> 00:35:32,900
And I was like, what if you were
to go into ChatGPT and say,

607
00:35:33,100 --> 00:35:38,600
hey, I am taking an exam in two
weeks, here are some specifics, and

608
00:35:38,600 --> 00:35:42,200
you give them like the specific
scenario that you have at work.

609
00:35:42,200 --> 00:35:46,300
That's a real life example.
And then you say using the PMP

610
00:35:46,300 --> 00:35:51,700
guide, create questions for me,
that help me study, that will

611
00:35:51,700 --> 00:35:54,400
help me understand this thing at
work.
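
(A sketch of that scenario-plus-source study prompt; the function and the example scenario are illustrative assumptions, not quotes from the episode.)

```python
def practice_questions(scenario: str, exam: str, source: str, count: int = 5) -> str:
    # Combine a real work scenario with a named source and ask for practice questions.
    return (
        f"I am taking the {exam} in two weeks. Here is a real situation from my job:\n"
        f"{scenario}\n"
        f"Using the {source}, write {count} practice questions with answers that help "
        "me study and help me understand how to handle this situation at work."
    )

print(practice_questions(
    scenario="Our sponsor keeps adding scope without adjusting the schedule.",
    exam="PMP exam",
    source="PMP guide",
))
```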

612
00:35:54,700 --> 00:35:58,000
Easy peasy.
As soon as you realize that it's

613
00:35:58,100 --> 00:36:01,600
Not just like accessing
information but it's asking it

614
00:36:01,600 --> 00:36:05,500
to combine and synthesize
information for you, it's really

615
00:36:05,500 --> 00:36:08,500
beautiful
what you can do with it.

616
00:36:10,500 --> 00:36:17,200
I don't know how to say this exactly,
but I have been pulled into this

617
00:36:17,200 --> 00:36:21,900
thing
that is data and language.

618
00:36:21,900 --> 00:36:30,200
I have been steeping in data and
language. It was in 2014

619
00:36:32,200 --> 00:36:35,600
when I first decided that my
methodology for my dissertation

620
00:36:36,200 --> 00:36:41,300
would be big data analysis.
And I've been thinking about

621
00:36:41,400 --> 00:36:44,000
the ethics of it since then. I've
been thinking about what I would

622
00:36:44,000 --> 00:36:48,800
do, and I was just thinking
and thinking, and it was with

623
00:36:48,800 --> 00:36:51,300
the advent... Like, the reason that
things fell off last year,

624
00:36:51,300 --> 00:36:52,600
right?
For a lot of different reasons

625
00:36:53,200 --> 00:36:56,100
but the reason it's being picked
up right now is because I was

626
00:36:56,100 --> 00:36:59,100
able to see what other people
were able to do with technology

627
00:36:59,600 --> 00:37:02,900
and that's what I hope people
walk away with from this whole

628
00:37:02,900 --> 00:37:06,800
conversation: go in and
don't feel like you're

629
00:37:06,800 --> 00:37:08,800
behind.
Don't feel like you're behind

630
00:37:08,800 --> 00:37:10,900
because you're already
behind, right?

631
00:37:11,100 --> 00:37:13,700
That's like trying to say I'm
behind in biology.

632
00:37:13,900 --> 00:37:16,400
Of course I am, I haven't taken cellular
biology since the 10th grade.

633
00:37:16,600 --> 00:37:19,000
Why would I be on top of
biology?

634
00:37:19,300 --> 00:37:22,100
You're gonna be behind.
The best thing that you can do

635
00:37:22,100 --> 00:37:25,500
is just be utterly amazed about
what other people are doing and

636
00:37:25,500 --> 00:37:29,800
using that to inspire yourself.
Like I did. Because for me,

637
00:37:29,800 --> 00:37:31,600
it wasn't, oh, I can
write this prompt right here.

638
00:37:31,600 --> 00:37:33,800
It was, I
can go start this company.

639
00:37:34,300 --> 00:37:36,600
I know how to do it and know
what I need to do.

640
00:37:37,400 --> 00:37:42,200
And I am empowered by artificial
intelligence to do it.

641
00:37:44,000 --> 00:37:48,600
That's yeah, absolutely.
And I want to talk about from

642
00:37:48,600 --> 00:37:50,500
the ID perspective, the
instructional designer

643
00:37:50,500 --> 00:37:55,500
perspective, how much time this
saves using ChatGPT or other AI

644
00:37:55,500 --> 00:37:58,800
to write outlines or just get an
idea of a storyboard because we

645
00:37:58,800 --> 00:38:02,300
spend so much time
conceptualizing based off the

646
00:38:02,300 --> 00:38:06,300
content that the SMEs give us,
how this is all going to flow.

647
00:38:06,400 --> 00:38:09,900
They'll give us a PowerPoint, but
that's not necessarily in the

648
00:38:09,900 --> 00:38:13,700
logical order, or the order it
should be in for learning.

649
00:38:13,800 --> 00:38:17,200
And so I use ChatGPT a lot just
to outline stuff, and I know

650
00:38:17,200 --> 00:38:19,700
people say that and it helps so
much.

651
00:38:19,700 --> 00:38:23,000
It just gives you like a sense
of relief that you don't have to

652
00:38:23,000 --> 00:38:25,300
go through this super critical
process.

653
00:38:25,300 --> 00:38:28,300
You can refocus your efforts on
the creative side of

654
00:38:28,300 --> 00:38:30,700
instructional design and
developing the interactives and

655
00:38:30,700 --> 00:38:32,200
things,
and working with the learners toward the

656
00:38:32,200 --> 00:38:35,500
behavioral changes and the
assessments that are needed to

657
00:38:35,500 --> 00:38:37,800
exhibit that.
So from an instructional

658
00:38:37,800 --> 00:38:41,000
designer perspective, this is
something that I can't wait

659
00:38:41,100 --> 00:38:44,000
for, until we
start incorporating these

660
00:38:44,000 --> 00:38:47,300
into the tools. Microsoft is putting
it in their tools, and when the

661
00:38:47,300 --> 00:38:50,200
Articulates and the Adobes of
the world start putting these

662
00:38:50,200 --> 00:38:54,800
into their tools, it will be so
much more manageable, and I feel

663
00:38:54,800 --> 00:38:57,700
like we're going to level up as
humans with AI. Like, we're

664
00:38:57,700 --> 00:38:59,700
already doing it.
We're just going

665
00:38:59,700 --> 00:39:03,100
to think more critically on
higher levels because of this.
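
Since Holly mentions outlining and storyboarding from SME material, here is a small, hypothetical Python sketch of that workflow. The bullet list, the JSON shape, and the example response are all invented for illustration; the only real idea is asking the model for a structured storyboard you can drop into your own template.

import json

# Raw content as an SME might hand it over (illustrative only).
sme_bullets = [
    "Lockout/tagout steps",
    "Why energy isolation matters",
    "Common near-miss stories",
    "Annual audit requirements",
]

# Ask for a reordered, grouped storyboard in a machine-readable shape.
prompt = (
    "You are assisting an instructional designer. Group and reorder the "
    "following SME content into a logical learning sequence, and return "
    "ONLY JSON shaped like "
    '{"modules": [{"title": str, "objective": str, "screens": [str]}]}.\n'
    "SME content:\n- " + "\n- ".join(sme_bullets)
)
print(prompt)

# Parsing a (hypothetical) model response once it comes back:
example_response = (
    '{"modules": [{"title": "Why isolation matters", '
    '"objective": "Explain the risk", '
    '"screens": ["Hook: a near-miss story", "Key terms"]}]}'
)
storyboard = json.loads(example_response)
for module in storyboard["modules"]:
    print(module["title"], "->", module["screens"])

Asking for JSON rather than free text is the design choice here: the outline lands in a shape the designer can pour straight into a storyboard template instead of reformatting prose by hand.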

666
00:39:04,300 --> 00:39:08,600
So here is, in my mind, a reality
that will happen once the

667
00:39:08,600 --> 00:39:13,300
hardware catches up, or once
things are catching up, and this

668
00:39:13,300 --> 00:39:16,000
is, this is what makes it
so scary, right?

669
00:39:16,000 --> 00:39:18,100
The reason people are afraid, like, AI
is going to come and replace

670
00:39:18,100 --> 00:39:20,800
my job is because you have
people like me thinking like

671
00:39:20,800 --> 00:39:22,600
this.
And here's what people need to

672
00:39:22,600 --> 00:39:24,700
do: you should be asking
these questions.

673
00:39:25,000 --> 00:39:26,700
You should be the ones
implementing these things in

674
00:39:26,700 --> 00:39:29,800
your organization, because
here's what's going to happen in

675
00:39:30,300 --> 00:39:31,600
five years.
Right?

676
00:39:31,600 --> 00:39:34,300
A company is going to have
artificial intelligence

677
00:39:34,300 --> 00:39:36,800
connected to its entire
knowledge base.

678
00:39:37,400 --> 00:39:40,700
You're gonna have a new hire
come in and say, hey, this is how

679
00:39:40,700 --> 00:39:44,700
I like to learn.
You won't even need an instructional

680
00:39:44,700 --> 00:39:47,600
designer anymore. They'll say, I like to
learn this way.

681
00:39:47,600 --> 00:39:50,000
Can you teach me about this
process?

682
00:39:51,000 --> 00:39:56,600
And now you have an interactive
instructional designer that will

683
00:39:56,600 --> 00:40:01,700
answer your questions,
personalized based on the

684
00:40:01,700 --> 00:40:05,400
knowledge base from the company,
based on how you like to learn.

685
00:40:05,500 --> 00:40:08,300
If you like to learn via
problems, or you like to learn

686
00:40:08,300 --> 00:40:11,400
via funny videos,
you can say, make me a funny

687
00:40:11,400 --> 00:40:14,500
video starring Bruce Willis that
teaches me about this principle.
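
As a thought experiment, here is a minimal Python sketch of the plumbing Eric is pointing at: retrieval over a company knowledge base feeding a personalized prompt. The toy embed() function, the three policy snippets, and the learning-style wording are all stand-ins; a real system would swap in actual embedding and chat (or video-generation) models.

import numpy as np

def embed(text: str) -> np.ndarray:
    # Toy embedding: hash characters into a 64-dimensional vector.
    # A real system would call an embedding model here instead.
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Stand-in "company knowledge base".
knowledge_base = [
    "Purchase orders over $10k require two approvals.",
    "Site safety briefings happen every Monday at 7 a.m.",
    "Timesheets are due in the ERP system by Friday at noon.",
]
kb_vectors = np.stack([embed(doc) for doc in knowledge_base])

def personalized_prompt(question: str, learning_style: str) -> str:
    # Retrieve the closest document, then ask for it to be taught
    # in the new hire's preferred style.
    scores = kb_vectors @ embed(question)
    best_doc = knowledge_base[int(np.argmax(scores))]
    return (
        f"Teach me the following in the style of {learning_style}: "
        f"{best_doc}\nMy original question was: {question}"
    )

# A real system would send this prompt to a chat or media-generation model.
print(personalized_prompt("When are timesheets due?", "a short funny story"))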

688
00:40:16,500 --> 00:40:21,400
That's there.
All the pieces are there and

689
00:40:21,400 --> 00:40:23,000
that's what everybody's going to
be thinking.

690
00:40:23,000 --> 00:40:25,600
That's when, 50 years from now,
they're going to be looking back

691
00:40:25,600 --> 00:40:27,800
and thinking, like, wow,
ChatGPT-4.

692
00:40:27,800 --> 00:40:29,400
What is that?
Just like my kid is going to

693
00:40:29,400 --> 00:40:32,600
look at, like, the Motorola 120e
or the Razr flip phone, that little

694
00:40:32,600 --> 00:40:35,100
phone.
It was really fancy because it had a

695
00:40:35,100 --> 00:40:36,700
blue screen instead of a green
screen.

696
00:40:36,900 --> 00:40:41,900
Yeah.
Anyway, there's just so much fun

697
00:40:41,900 --> 00:40:44,500
stuff out there and I hope that
people are not intimidated by

698
00:40:44,500 --> 00:40:50,400
it, and they feel excited about
it, because you should be. Agreed.

699
00:40:52,100 --> 00:40:53,400
Yeah.
And I think that, and you

700
00:40:53,400 --> 00:40:55,900
mentioned this before, and we're
just around 40 minutes in,

701
00:40:55,900 --> 00:40:57,200
Eric.
I do want to honor your time, and I

702
00:40:57,207 --> 00:41:01,100
know that you have to go.
You have to go play with it, right?

703
00:41:01,100 --> 00:41:05,600
If you're out there, whether you're in
higher ed or any field, really,

704
00:41:06,200 --> 00:41:10,300
if you're concerned, if you're
worried, that is okay. Those are

705
00:41:10,300 --> 00:41:13,700
emotionally valid responses.
Make sure you get in and you

706
00:41:14,000 --> 00:41:18,000
really play with it.
And I'll be totally honest, for

707
00:41:18,000 --> 00:41:21,700
me, emotionally, I have good
days, I have bad days with AI.

708
00:41:21,800 --> 00:41:24,100
Some days
I think, yes, this is going to

709
00:41:24,100 --> 00:41:26,600
allow me to do X, Y, and Z
faster, better,

710
00:41:26,600 --> 00:41:29,500
so on and so forth.
And other days, I'm very

711
00:41:29,500 --> 00:41:32,400
negative about it and I go back
and forth and I've learned to be

712
00:41:32,400 --> 00:41:36,300
emotionally okay with that,
depending on my own state, what

713
00:41:36,300 --> 00:41:41,400
I'm working with. I think that,
for a lot of educators, we need

714
00:41:41,400 --> 00:41:43,700
to do our due
diligence, which means playing

715
00:41:43,700 --> 00:41:46,600
with it, playing with the
software, really reflecting on

716
00:41:46,600 --> 00:41:50,900
it, and being okay with not
feeling totally consistent

717
00:41:50,900 --> 00:41:53,900
emotionally with the tech.
I think that a lot of people are

718
00:41:53,900 --> 00:41:55,900
there, and it's okay to be there
too.

719
00:41:56,200 --> 00:41:57,300
All right.
And so I do want to honor

720
00:41:57,300 --> 00:41:57,900
your time.
Eric.

721
00:41:57,900 --> 00:42:01,200
So, very last question, if
someone listening to this

722
00:42:01,200 --> 00:42:04,900
episode wants to talk to you,
wants to have a conversation, or

723
00:42:04,900 --> 00:42:10,200
reach out to you for any reason,
where should they go? 100%, LinkedIn

724
00:42:10,300 --> 00:42:13,500
is where I live.
You could also check out the

725
00:42:13,500 --> 00:42:18,800
project, it's www.projectrhizome.com,
and you can get a

726
00:42:18,800 --> 00:42:23,700
whole bunch of stuff there too.
Fantastic.

727
00:42:24,600 --> 00:42:27,200
I love it.
I'm so glad you came back to

728
00:42:27,200 --> 00:42:28,800
this.
I'm so glad you came back to

729
00:42:28,800 --> 00:42:30,800
this.
I am very grateful for the two

730
00:42:30,800 --> 00:42:36,500
of you, truly, for not only having
this conversation but being able

731
00:42:36,500 --> 00:42:38,800
to promote and all that kind of
stuff.

732
00:42:39,200 --> 00:42:43,800
But I think like it is this
community that we built on

733
00:42:43,800 --> 00:42:47,200
LinkedIn over the past year or
two years, whatever, that I'm

734
00:42:47,200 --> 00:42:50,500
just very grateful for.
And I don't think that I could

735
00:42:50,500 --> 00:42:57,600
have approached the problem that
I am trying to approach if it

736
00:42:57,600 --> 00:43:02,200
were not for the constant,
iterative reactions and refining

737
00:43:02,200 --> 00:43:05,700
of ideas with people that I would
not have had without you.

738
00:43:05,700 --> 00:43:08,500
So again, I'm just very
grateful. Thank you both for

739
00:43:08,800 --> 00:43:13,900
doing this. Absolutely, anytime.
Okay, you're so welcome.

740
00:43:13,900 --> 00:43:16,000
And thank you so much, Eric, for
coming on the show.

741
00:43:16,100 --> 00:43:19,700
Coming on the first joint
podcast and good luck with

742
00:43:19,700 --> 00:43:21,700
everything.
It's been a pleasure hearing

743
00:43:21,700 --> 00:43:24,400
you talk about AI.
Thank you so much.

744
00:43:29,700 --> 00:43:34,000
You've just experienced
another amazing episode of Ed

745
00:43:34,000 --> 00:43:35,100
up.
Ed Tech.

746
00:43:35,700 --> 00:43:41,100
Be sure to visit our website at
EdUpEdTech.com to get all the

747
00:43:41,100 --> 00:43:43,900
updates on the latest edtech
happenings.

748
00:43:44,600 --> 00:43:46,000
See you next time.