
Overview
In the second part of this two-part episode, Dr. Seixas and I talk about the future of tech and how we're going to be bold and strive for excellence. Enjoy the second part of this episode.
More About Dr. Azizi Seixas in his own words
Innovator. Scientist. Thought leader. Technologist. Sleep and Circadian Sciences expert. Creator of Precision and Personalized Population Health. Voted top 100 most inspiring Black Scientists in America by Cell Press. I believe in health and wellness for all through disruptive and scalable innovation.
How to Connect: Connect with Dr. Seixas on LinkedIn & follow The MIL on social media: LinkedIn | Facebook | Instagram | YouTube | Twitter
____________________________
Connect with the host: Holly Owens
Audio editor: Daniel Stein
EdUp EdTech - We make EdTech Your Business!
Thanks for tuning in!
Thanks for joining us on today’s episode of EdUp EdTech! If you enjoyed today’s episode, please head to our website and leave us a rating and review to help us reach even more fantastic audience members like you. Don’t forget to check out our website, visit us on LinkedIn, or hang out with us on Facebook or Instagram to stay up-to-date on the latest EdTech happenings.
Thanks for tuning in! 🎧
1
00:00:01,200 --> 00:00:06,800
Hello, my name is Holly Owens
and welcome to EdUp EdTech, the
2
00:00:06,800 --> 00:00:10,500
podcast that keeps you
in the know about all the latest
3
00:00:10,500 --> 00:00:14,300
edtech happenings.
We interview guests from around
4
00:00:14,300 --> 00:00:17,500
the globe to give you deeper
insights into the EdTech
5
00:00:17,500 --> 00:00:21,500
industry, the field of
instructional design, and more,
6
00:00:21,800 --> 00:00:25,200
we're proudly a part of
America's leading podcast
7
00:00:25,200 --> 00:00:29,900
network, the EdUp Experience.
It's time to sit back and
8
00:00:30,000 --> 00:00:33,000
enjoy the latest episode of
EdUp EdTech.
9
00:00:41,800 --> 00:00:44,700
Welcome to part 2 of the episode
with Dr.
10
00:00:44,700 --> 00:00:48,200
Seixas, founding director of The
MIL.
11
00:00:48,500 --> 00:00:53,200
In this part of the episode,
we talk about things that have
12
00:00:53,200 --> 00:01:05,099
to do with the future, enjoy.
We're not just doing it alone.
13
00:01:05,600 --> 00:01:08,700
We're not beating our chest
saying that we have got it
14
00:01:08,700 --> 00:01:13,200
figured out. What we are saying
is that we're going to be bold
15
00:01:13,700 --> 00:01:18,600
about striving for more that we
cannot just think of this as
16
00:01:18,600 --> 00:01:21,700
business as usual and just
operate as if we're trying to
17
00:01:21,700 --> 00:01:24,400
satisfy the lowest common
denominator.
18
00:01:24,800 --> 00:01:29,600
No, we need to strive, we need
to find ways in which we can
19
00:01:29,600 --> 00:01:34,000
find benchmarks
whereby we strive for
20
00:01:34,000 --> 00:01:37,000
excellence.
But it's excellence that is
21
00:01:37,000 --> 00:01:40,300
not just about,
Hey, look at what we're doing
22
00:01:40,300 --> 00:01:44,200
because of how smart we are, but
look at what we're doing for
23
00:01:44,200 --> 00:01:49,300
others. And to me, that's
the greatest thing: it's service.
24
00:01:49,500 --> 00:01:53,000
And to me, that's what we're
focused on. That's why we're
25
00:01:53,200 --> 00:01:58,200
unapologetic about it. We recognize
that there is this common
26
00:01:58,200 --> 00:02:01,500
denominator, that you need to
bring people in and hook them to
27
00:02:01,500 --> 00:02:03,600
hospitals and healthcare
settings.
28
00:02:04,000 --> 00:02:08,100
But particularly for us at the
University of Miami, we are
29
00:02:08,100 --> 00:02:13,900
situated in a very important,
historically important
30
00:02:13,900 --> 00:02:17,000
community in the entire United
States: Overtown.
31
00:02:17,400 --> 00:02:20,700
It's one of the first
predominantly black
32
00:02:20,700 --> 00:02:24,100
neighborhoods in Florida and in
the United States.
33
00:02:24,600 --> 00:02:30,100
But unfortunately, based on years
of policies, bad policies and the
34
00:02:30,100 --> 00:02:33,000
like, there has been
disinvestment.
35
00:02:33,700 --> 00:02:38,100
And so for me, as one of the
leaders, at the University of
36
00:02:38,100 --> 00:02:41,500
Miami, and I know this is shared
by our Dean and others, is we
37
00:02:41,500 --> 00:02:46,900
can't just be in that Community.
We have to be for the community
38
00:02:47,000 --> 00:02:50,900
with the community, right?
And so, how do we do that?
39
00:02:50,900 --> 00:02:52,600
We can't just do it through lip
service.
40
00:02:52,800 --> 00:02:57,100
We have to be doing it similar
to how technologists build
41
00:02:57,100 --> 00:03:01,800
solutions: human-centered design.
How do we create programs that
42
00:03:01,800 --> 00:03:05,500
are community-centered design,
individually-
43
00:03:05,500 --> 00:03:08,000
based design.
And that's what we're trying to
44
00:03:08,000 --> 00:03:10,100
do.
That's our Holy Grail of
45
00:03:10,100 --> 00:03:13,000
innovation and technology.
So if it only works for a few
46
00:03:13,000 --> 00:03:15,400
then, okay, let's find a way how
it can work for more people.
47
00:03:15,900 --> 00:03:19,300
Yeah, 100%.
That's that's fantastic.
48
00:03:19,400 --> 00:03:21,500
I'm sure within all these
pillars, you have a lot of
49
00:03:21,500 --> 00:03:23,800
different projects that are
going on right now.
50
00:03:23,800 --> 00:03:27,500
Is there anything that you have
that you can share and talk
51
00:03:27,500 --> 00:03:30,400
about for the upcoming year?
Yeah, thank you.
52
00:03:30,700 --> 00:03:33,900
Thank you so much.
Things that you're doing?
53
00:03:33,900 --> 00:03:35,100
What are your goals?
Yeah.
54
00:03:35,400 --> 00:03:38,700
And the roadmap that I'm about
to share, it's really more of an
55
00:03:38,700 --> 00:03:42,300
invitation to your audience.
As I said, we were very humble.
56
00:03:42,300 --> 00:03:45,400
Oh, we're bullish about it.
But we're also very humble, we
57
00:03:45,400 --> 00:03:48,100
can't do it by ourselves.
And this is an invitation for
58
00:03:48,100 --> 00:03:51,900
people to join this movement.
Let's turn this moment into a
59
00:03:51,900 --> 00:03:56,700
movement.
So for example, in education, we are
60
00:03:56,700 --> 00:04:01,600
galvanizing about 30 to 40
different startup and technology
61
00:04:01,600 --> 00:04:05,000
companies that are going to be
having apprenticeships for our
62
00:04:05,000 --> 00:04:08,800
students.
So my request is many of the
63
00:04:08,800 --> 00:04:12,400
innovators and founders and
leaders out there, come talk to
64
00:04:12,400 --> 00:04:17,000
us because we believe that the
next MDs and scientists will be
65
00:04:17,000 --> 00:04:19,800
Chief technology officers are
going to be the chief medical
66
00:04:19,800 --> 00:04:23,500
officers or, if not, founders.
Help us to train these folks.
67
00:04:23,500 --> 00:04:28,200
Help them, help us to build a
whole ecosystem, where it's not
68
00:04:28,200 --> 00:04:32,400
just technology-trained folks
who are part of the development
69
00:04:32,400 --> 00:04:35,000
of solutions, but everyone, so
that's education.
70
00:04:35,000 --> 00:04:36,400
So we're rolling out that
program.
71
00:04:36,400 --> 00:04:38,100
So we're very excited about
that.
72
00:04:38,300 --> 00:04:42,900
In terms of research, we have
several studies. We're rolling out
73
00:04:42,900 --> 00:04:44,700
our MIL Box.
The MIL Box is a remote
74
00:04:44,700 --> 00:04:47,700
health monitoring solution with
1,500 participants:
75
00:04:48,100 --> 00:04:51,400
African American and Black
individuals and Latinos in
76
00:04:51,400 --> 00:04:55,300
rural and urban areas, because
we've got to get rid of
77
00:04:55,300 --> 00:04:59,100
this urban-rural divide as well.
And we're providing this remote
78
00:04:59,100 --> 00:05:01,700
Health monitoring solution
collecting data over a seven day
79
00:05:01,700 --> 00:05:04,700
period.
And creating digital twins
80
00:05:05,300 --> 00:05:08,500
working with companies like
Amazon on this project.
81
00:05:08,500 --> 00:05:11,400
It's critical that you
mention that, because I work in a
82
00:05:11,900 --> 00:05:16,100
pharmacy at Amazon. There you go.
So they did.
83
00:05:16,100 --> 00:05:18,700
So we got to connect because
that's one of the things that I
84
00:05:18,700 --> 00:05:20,900
know some of those folks there
as well too.
85
00:05:21,800 --> 00:05:24,800
But I think this is where
we have to provide these
86
00:05:24,800 --> 00:05:28,200
wraparound supports, and I know
big tech companies like
87
00:05:28,200 --> 00:05:30,500
Amazon.
They've tried to get into health
88
00:05:30,500 --> 00:05:32,500
care and haven't done so
successfully.
89
00:05:32,700 --> 00:05:35,200
It's not too bad, you know, it's
not your fault.
90
00:05:35,200 --> 00:05:39,300
It's not Amazon's fault, because
the DNA of Amazon is not
91
00:05:39,300 --> 00:05:42,100
to take care of patients.
That's our job, but if we
92
00:05:42,100 --> 00:05:47,000
partner together, where we
amplify our respective strengths
93
00:05:47,000 --> 00:05:49,800
and that's where we have
something special, right?
94
00:05:49,800 --> 00:05:52,100
And so that's kind of one area
that we're working on.
95
00:05:52,400 --> 00:05:55,000
Plus, we're rolling out some
really nice projects that
96
00:05:55,200 --> 00:05:58,600
provide maternal mental health
to moms, in particular
97
00:05:58,600 --> 00:06:02,500
Black and brown moms, using VR,
working with a company called
98
00:06:02,600 --> 00:06:06,500
BehaVR, doing that type of work,
as well as some other companies.
99
00:06:06,500 --> 00:06:09,300
I don't want to leave anyone out.
And we're really trying to find some
100
00:06:09,300 --> 00:06:11,900
new digital biomarkers as well
too.
101
00:06:12,000 --> 00:06:16,700
And making healthcare more
frictionless, contactless, and
102
00:06:16,700 --> 00:06:20,500
dare I say deviceless.
And so that's kind of what we're
103
00:06:20,500 --> 00:06:24,200
aiming for and really building
on our digital therapeutic.
104
00:06:24,200 --> 00:06:29,000
My vision is, we want
the University of Miami to be
105
00:06:29,000 --> 00:06:32,200
one of the first digital
therapeutic health systems.
106
00:06:32,700 --> 00:06:34,400
It doesn't mean we're getting
rid of doctors.
107
00:06:34,900 --> 00:06:38,700
It just means that many of us as
clinicians,
108
00:06:38,700 --> 00:06:42,000
and my friends who are currently
seeing patients, they want to be
109
00:06:42,000 --> 00:06:44,900
there for their patients beyond
the 15 minutes that they get
110
00:06:44,900 --> 00:06:48,000
with them.
So maybe we can build digital
111
00:06:48,000 --> 00:06:52,700
solutions and therapeutics to
provide this wraparound, 24/7
112
00:06:52,700 --> 00:06:58,200
support for patients, right?
When patients need us and this
113
00:06:58,200 --> 00:07:01,300
is, you know, can be
asynchronous or synchronous or
114
00:07:01,300 --> 00:07:04,200
on-demand, whatever modality
we want to choose.
115
00:07:04,500 --> 00:07:07,700
That's what we're trying to
build out as well.
116
00:07:07,700 --> 00:07:11,400
And in terms of service, our
Reach digital literacy
117
00:07:11,400 --> 00:07:13,800
program, we're very bullish
about it.
118
00:07:13,800 --> 00:07:16,900
We're about to finalize a
partnership.
119
00:07:16,900 --> 00:07:19,500
We've built our own digital
literacy program, but I know
120
00:07:19,500 --> 00:07:23,000
others have been doing this and,
you know, there is one company
121
00:07:23,500 --> 00:07:25,300
that has been doing some good
work.
122
00:07:25,500 --> 00:07:30,600
We want to ensure that any
patient, anyone who comes through
123
00:07:30,600 --> 00:07:34,600
our hospital system,
gets the digital literacy
124
00:07:34,600 --> 00:07:38,000
training. We can't
expect or assume that people
125
00:07:38,000 --> 00:07:42,700
know how to log on and sign up
for a username and password.
126
00:07:42,700 --> 00:07:45,500
Now we've got to train them, we've got
to educate and provide
127
00:07:45,500 --> 00:07:48,100
support.
So even at that Elemental level,
128
00:07:48,400 --> 00:07:52,800
that's where we can help to
demystify and destigmatize data,
129
00:07:52,800 --> 00:07:56,700
you know, data and technology as
well, and how we can make it more
130
00:07:56,700 --> 00:07:59,200
private.
And I think with regards to digital
131
00:07:59,200 --> 00:08:01,100
twins,
that's where we want to focus, on
132
00:08:01,100 --> 00:08:03,500
precision medicine. And I
don't want to say too much.
133
00:08:03,500 --> 00:08:04,900
I don't want to tip my hand too
much.
134
00:08:04,900 --> 00:08:08,700
But we're trying to build a
roadmap of how we can make
135
00:08:08,800 --> 00:08:11,600
Healthcare more personalized
through the use of digital
136
00:08:11,600 --> 00:08:16,800
solutions.
I can't wait. I can't wait either.
137
00:08:17,300 --> 00:08:20,800
I'm so looking forward to that
because I'll tell you my
138
00:08:20,800 --> 00:08:24,600
grandparents are still living,
and they are navigating all
139
00:08:24,600 --> 00:08:30,300
these patient portals for every
single doctor, and I'm like, oh my
140
00:08:30,300 --> 00:08:32,500
gosh. And for 40-year-olds, that's
hard.
141
00:08:33,200 --> 00:08:35,799
It's hard for me.
Well, that's what I was about to say, I
142
00:08:35,799 --> 00:08:37,600
mean, it's tough for
anyone.
143
00:08:37,799 --> 00:08:41,299
Yeah, because we lead busy
lives, like, who has the time
144
00:08:41,299 --> 00:08:44,300
like, people have to realize,
and I said this all the time.
145
00:08:44,300 --> 00:08:46,500
I say it to our team,
especially when we're
146
00:08:46,500 --> 00:08:50,400
recruiting participants to
studies or seeing patients: like,
147
00:08:50,400 --> 00:08:53,200
no one wakes up
thinking or saying, I want to be part
148
00:08:53,200 --> 00:08:56,800
of a research study so how can
we make it inviting?
149
00:08:57,200 --> 00:09:01,100
But how can we make sure people
realize that they are the most
150
00:09:01,100 --> 00:09:03,900
important.
A lot of research studies,
151
00:09:03,900 --> 00:09:05,800
without them,
there is no legacy.
152
00:09:05,900 --> 00:09:08,600
There are no findings.
They are literally helping to
153
00:09:08,600 --> 00:09:13,200
change the trajectory of our
histories by participating in my
154
00:09:13,200 --> 00:09:15,300
studies.
And it's not hyperbolic, but
155
00:09:15,300 --> 00:09:18,300
this is what we believe and
that's what we try and share.
156
00:09:18,300 --> 00:09:21,100
And that's what we, not try, but
we share with participants.
157
00:09:21,700 --> 00:09:26,000
Like they are literally changing
the trajectory of our histories
158
00:09:26,000 --> 00:09:29,900
by participating because their
data will help us to understand
159
00:09:29,900 --> 00:09:32,400
how to cure cancer,
and also
160
00:09:32,600 --> 00:09:36,800
how to cure heart disease, to
help to cure dementia, how to
161
00:09:36,800 --> 00:09:40,900
tackle mental illness.
Exactly, yeah.
162
00:09:42,000 --> 00:09:46,500
Yeah, so we've got to make
it a community and a partnership,
163
00:09:46,600 --> 00:09:49,900
and I know others are doing
excellent work. As I said, we're
164
00:09:49,900 --> 00:09:53,300
not the only ones doing this.
I think what we are very proud
165
00:09:53,300 --> 00:09:56,600
of is how comprehensively we
want to do it.
166
00:09:56,700 --> 00:09:59,800
And our team works
extremely hard, because it's a
167
00:09:59,800 --> 00:10:03,800
very ambitious vision and plan
but it's not a plan that we
168
00:10:03,800 --> 00:10:06,600
know,
or we believe, that only the MIL
169
00:10:06,800 --> 00:10:09,300
can actually satisfy and
accomplish.
170
00:10:09,500 --> 00:10:12,400
We need an army.
We Need a village.
171
00:10:12,400 --> 00:10:15,700
We need an entire ecosystem.
Everyone needs to get their
172
00:10:15,700 --> 00:10:19,700
proverbial hands dirty.
Yes, agreed.
173
00:10:20,100 --> 00:10:23,300
Oh my goodness.
All right, we're coming up on
174
00:10:23,300 --> 00:10:26,100
the end of the episode, which means you
have to come back on the show.
175
00:10:26,100 --> 00:10:28,100
Am I right?
Well, you know where to reach me,
176
00:10:28,100 --> 00:10:29,600
Holly.
Yes, absolutely.
177
00:10:29,600 --> 00:10:33,200
So final questions.
Is there anything else you'd
178
00:10:33,200 --> 00:10:34,800
like to share anything that we
missed?
179
00:10:34,800 --> 00:10:38,400
And then I want to know, what
does the future of EdTech
180
00:10:38,400 --> 00:10:40,200
look like? Tell us anything
we missed.
181
00:10:40,500 --> 00:10:43,000
Yeah.
So I'll just take the last part. I
182
00:10:43,000 --> 00:10:47,300
think, when I discuss
the future, I'll talk about some
183
00:10:47,300 --> 00:10:51,300
other things that we're currently
doing. So our group, not just our
184
00:10:51,300 --> 00:10:56,000
group, but the University of Miami,
is part of a large group of schools
185
00:10:56,000 --> 00:11:00,600
throughout the entire United
States that are leading this
186
00:11:00,600 --> 00:11:03,100
fantastic initiative
that the National Institutes of
187
00:11:03,100 --> 00:11:05,700
Health has pushed. It's called
AIM-AHEAD.
188
00:11:06,600 --> 00:11:12,000
The goal of AIM-AHEAD is to
tackle health disparities and to
189
00:11:12,000 --> 00:11:18,100
achieve Health Equity and it is
doing so by way of including and
190
00:11:18,100 --> 00:11:21,700
seeing how AI, machine
learning, and big data
191
00:11:21,800 --> 00:11:25,500
can help us to better understand
health disparities, and to find
192
00:11:25,500 --> 00:11:27,400
solutions that can achieve
Equity.
193
00:11:27,900 --> 00:11:32,200
What we have done is that we
have started out trying to build
194
00:11:32,300 --> 00:11:36,000
a regional network,
just in Florida. That doesn't
195
00:11:36,000 --> 00:11:39,300
mean that we're only focused on
Florida; this is open to the
196
00:11:39,300 --> 00:11:40,900
entire United States, by the
way.
197
00:11:42,000 --> 00:11:44,600
But what we are doing here at
the University of Miami is that
198
00:11:44,600 --> 00:11:48,000
we have partnered with a few
institutions that are minority-
199
00:11:48,000 --> 00:11:51,600
serving institutions: HBCUs,
historically Black colleges and
200
00:11:51,600 --> 00:11:53,600
universities;
Hispanic-serving
201
00:11:53,600 --> 00:11:55,800
institutions and tribal
colleges.
202
00:11:56,300 --> 00:11:58,100
And essentially, our vision is
this.
203
00:11:58,600 --> 00:12:04,200
If we want representative data
for research purposes, and
204
00:12:04,200 --> 00:12:08,200
to improve our clinical
operations, large academic
205
00:12:08,200 --> 00:12:11,400
institutions are not necessarily,
all the time,
206
00:12:11,400 --> 00:12:14,400
at the front lines of tackling
health disparities, for a wide
207
00:12:14,400 --> 00:12:16,600
variety of different reasons.
I don't want to get in trouble.
208
00:12:17,500 --> 00:12:22,300
The point is, the
folks who are at the front lines
209
00:12:22,600 --> 00:12:26,400
are community-based clinics,
federally qualified health
210
00:12:26,400 --> 00:12:31,000
centers, urgent cares.
And many of these places are in
211
00:12:31,000 --> 00:12:34,300
impoverished, under-resourced
locations.
212
00:12:34,600 --> 00:12:38,700
So, here's the vision: we
said, what if we partner with an
213
00:12:38,700 --> 00:12:42,600
academic institution that's in
those communities, where Health
214
00:12:42,600 --> 00:12:46,700
disparities are high, and we build
up the research, AI, and machine
215
00:12:46,700 --> 00:12:50,000
learning infrastructure of those
institutions, whether in terms
216
00:12:50,000 --> 00:12:53,300
of research or clinical care and
the like? And then, in a
217
00:12:53,300 --> 00:12:58,500
hub-and-spoke model, they then
get to train and help the local
218
00:12:58,500 --> 00:13:01,500
clinics.
So here it is: you know, the
219
00:13:01,500 --> 00:13:04,100
reason why people come to our
University of Miami or a
220
00:13:04,100 --> 00:13:10,000
Harvard, or an NYU, or a USC is
because of all the superb talent
221
00:13:10,000 --> 00:13:13,700
that they have in terms of
clinicians and the like, plus
222
00:13:13,700 --> 00:13:15,900
the infrastructure:
you can do
223
00:13:15,900 --> 00:13:18,500
several imaging studies, and you can
get the results in no time.
224
00:13:18,500 --> 00:13:19,900
You can do blood work and the
results
225
00:13:19,900 --> 00:13:22,500
come back in no time.
You don't have that in
226
00:13:22,500 --> 00:13:26,300
local community-based clinics
and that's what in many ways,
227
00:13:26,300 --> 00:13:27,800
from a health services
perspective,
228
00:13:27,800 --> 00:13:29,800
that's what drives health
disparities.
229
00:13:30,200 --> 00:13:34,100
So what if we build
infrastructures eliminating
230
00:13:34,100 --> 00:13:38,900
access issues, eliminating
slowness in processes in these
231
00:13:38,900 --> 00:13:44,500
settings, whereby we can provide
maybe as good, if not better, care,
232
00:13:44,500 --> 00:13:47,700
because these smaller places
might be more
233
00:13:47,700 --> 00:13:51,500
well resourced with digital
infrastructure, and they can be a
234
00:13:51,500 --> 00:13:55,200
little more nimble than these
big, ship-like types of brick-and-
235
00:13:55,200 --> 00:13:58,300
mortar academic Health Centers.
That's what we're building.
236
00:13:58,300 --> 00:14:02,200
And so that's just one method.
We need to stop
237
00:14:02,200 --> 00:14:04,600
just talking about health;
we need more action trying to solve
238
00:14:04,600 --> 00:14:07,300
health disparities.
What we're trying to do is,
239
00:14:07,300 --> 00:14:09,900
we're all about that action.
I know that's somewhat
240
00:14:09,900 --> 00:14:13,100
vernacular, but we're all about
that action in many ways.
241
00:14:13,400 --> 00:14:17,400
That's what we're trying to do.
And we're trying to make this a
242
00:14:17,400 --> 00:14:21,600
pathway and a national model as a
whole to do that.
243
00:14:21,800 --> 00:14:25,400
And so that's how we're
committed, from a future
244
00:14:25,400 --> 00:14:30,700
standpoint, to tackling technology,
the digital divide, and exclusion.
245
00:14:30,700 --> 00:14:32,900
And these are some ways in which
we are doing that.
246
00:14:32,900 --> 00:14:35,300
And just through this program,
you know, and our other
247
00:14:35,300 --> 00:14:38,800
programs that we're currently
doing that emanates and is a
248
00:14:38,800 --> 00:14:41,300
tributary of this main program
as well.
249
00:14:42,300 --> 00:14:45,100
Fantastic, I'm really looking
forward to the future.
250
00:14:46,000 --> 00:14:48,400
Yeah, after talking to you, I'm
looking forward to the future.
251
00:14:50,200 --> 00:14:52,900
Really excited about it.
Really, really excited, you
252
00:14:52,900 --> 00:14:56,000
know, you know.
So, I must be honest with you.
253
00:14:56,000 --> 00:15:00,500
I'm gonna try and avoid the
ChatGPT conversation, but I will say
254
00:15:00,500 --> 00:15:03,900
this wonderful thing when we
were teaching, we had a class
255
00:15:03,900 --> 00:15:07,500
the other day at one of the
local HBCUs, and we were having a
256
00:15:07,500 --> 00:15:11,200
conversation around it because, you
know, I think many schools are
257
00:15:11,200 --> 00:15:15,900
apoplectic about the inclusion
of ChatGPT, you know, in
258
00:15:15,900 --> 00:15:18,800
schools because they're like
well students won't get to
259
00:15:18,800 --> 00:15:20,800
learn.
I was having this discussion
260
00:15:20,800 --> 00:15:23,800
with my
mom, who is also a professor, and
261
00:15:23,800 --> 00:15:29,000
she teaches a group of students
who may be classified as you
262
00:15:29,000 --> 00:15:33,100
know, historically
underrepresented in STEM and the
263
00:15:33,100 --> 00:15:35,600
like.
I'm of the mindset that inasmuch
264
00:15:35,600 --> 00:15:39,800
as ChatGPT is not a panacea for
curing all, it's gonna be
265
00:15:39,800 --> 00:15:42,700
here to stay.
Absolutely.
266
00:15:43,200 --> 00:15:46,900
And what do we do?
How can this revolutionize
267
00:15:47,200 --> 00:15:50,900
education?
I liken it, rather, to how many
268
00:15:50,900 --> 00:15:54,200
of us in education
adopted this new pedagogical
269
00:15:54,200 --> 00:16:00,200
framework called class flipping
or a flipped classroom where the
270
00:16:00,200 --> 00:16:03,600
onus was on the students to come
prepared to read.
271
00:16:03,600 --> 00:16:07,400
And then we can discuss.
And the point that I'm making is
272
00:16:07,400 --> 00:16:12,100
this: many people say,
well, if you use ChatGPT,
273
00:16:12,100 --> 00:16:14,900
students, won't know how to
think, and they won't be able to
274
00:16:14,900 --> 00:16:17,600
write,
and all of that. But we see the type
275
00:16:17,600 --> 00:16:19,400
of students that are Left
Behind.
276
00:16:19,408 --> 00:16:22,800
These are the students who
traditionally are
277
00:16:22,800 --> 00:16:25,500
going to school either
part-time, or if they're going
278
00:16:25,500 --> 00:16:28,400
to school full-time,
they're also working full-time.
279
00:16:28,500 --> 00:16:31,800
And so, they may also have a
wide variety of different social
280
00:16:31,800 --> 00:16:33,200
demands.
So what do we do for these
281
00:16:33,200 --> 00:16:36,000
folks?
Do we say this one-size-fits-all
282
00:16:36,000 --> 00:16:39,700
warehouse educational program
must work for them?
283
00:16:39,700 --> 00:16:43,300
This is why so many of our
students fall behind. But what if
284
00:16:43,300 --> 00:16:46,800
we use AI and machine learning, not
just singling out ChatGPT
285
00:16:46,800 --> 00:16:49,100
or Bard or any of these large
language models.
286
00:16:49,100 --> 00:16:51,600
Because they have they have
their issues, right?
287
00:16:51,900 --> 00:16:58,200
Particularly around this concept
of hallucinations or stochastic
288
00:16:58,200 --> 00:17:00,500
parroting.
I believe stochastic parroting
289
00:17:00,500 --> 00:17:05,300
where it will just make stuff up,
and people like to say it's no
290
00:17:05,300 --> 00:17:09,000
good. No, it's just been released;
it needs to get better.
291
00:17:09,099 --> 00:17:11,900
So I reckon, I want your
audience to know, that it has its
292
00:17:11,900 --> 00:17:14,800
limitations or what people call
hallucinations, right?
293
00:17:14,800 --> 00:17:17,700
Or what people call
confabulations where it tries to
294
00:17:17,700 --> 00:17:20,400
fill in gaps and it will make
stuff up.
295
00:17:20,500 --> 00:17:24,599
It's no different from Someone
who has a cognitive impairment
296
00:17:24,599 --> 00:17:28,300
and tries to fill in the gaps
with a guess, and that's what AI
297
00:17:28,300 --> 00:17:30,800
has been built to do; it's a
predictive model, right?
298
00:17:30,800 --> 00:17:35,300
So it has its issues.
The problem, though, is that if we
299
00:17:35,300 --> 00:17:39,400
tell a group of students that they
cannot use this, while the
300
00:17:39,400 --> 00:17:43,700
haves and the have-mores, they
have already benefited from
301
00:17:43,700 --> 00:17:47,300
years of privilege.
And the one thing that could
302
00:17:47,300 --> 00:17:51,500
help to equalize the footing,
these students can't use it.
303
00:17:51,800 --> 00:17:54,900
I have an issue with that. So
there needs to be some ethics
304
00:17:54,900 --> 00:17:58,100
around data as to when it can
be used. But one thing I was
305
00:17:58,100 --> 00:18:01,000
sharing with
my mom is, what if we use Chat
306
00:18:01,000 --> 00:18:05,100
GPT or any large language model
as a way of helping kids to
307
00:18:05,100 --> 00:18:08,500
think critically to build an
organized thought process
308
00:18:09,100 --> 00:18:13,000
because anyone who has used it,
exactly, it saves time.
309
00:18:13,100 --> 00:18:15,900
Exactly.
But don't you feel like Holly,
310
00:18:15,900 --> 00:18:19,100
don't you feel like it has
forced you to ask better
311
00:18:19,100 --> 00:18:22,100
questions?
Yeah, that's right, because, you know,
312
00:18:22,100 --> 00:18:24,500
exactly.
Okay, this is where I want to go
313
00:18:25,000 --> 00:18:28,200
but how do I get there?
You ask this question,
314
00:18:28,200 --> 00:18:31,900
and then that question leads to
another question, and here is
315
00:18:31,900 --> 00:18:34,300
your research.
Exactly.
316
00:18:34,500 --> 00:18:38,200
That's what I'm saying.
What if we as educators
317
00:18:38,200 --> 00:18:42,300
use this as an opportunity to
help students learn how to think
318
00:18:42,300 --> 00:18:45,300
critically and to develop
arguments? Because, you know, what
319
00:18:45,300 --> 00:18:48,700
happened before
this? You would tell students to
320
00:18:48,700 --> 00:18:52,400
go do research, and students
would spend hours upon
321
00:18:52,700 --> 00:18:56,000
hours looking,
going down these rabbit holes
322
00:18:56,000 --> 00:18:58,400
to find information. When they
come back,
323
00:18:58,600 --> 00:19:02,200
it's not digested, it's not
synthesized, it's not synergized.
324
00:19:02,800 --> 00:19:07,200
So then, what if we can kind of
speed up and accelerate the
325
00:19:07,200 --> 00:19:11,500
process of the search and focus
on the critical thinking on how
326
00:19:11,500 --> 00:19:15,400
to develop persuasive arguments?
Because in many ways, what will
327
00:19:15,400 --> 00:19:17,200
happen?
And I've said this previously,
328
00:19:17,500 --> 00:19:21,600
the type of students that we
will likely have based on
329
00:19:21,700 --> 00:19:24,200
technology,
these are students who have to
330
00:19:24,200 --> 00:19:28,700
be what we call information
manipulators because we have a
331
00:19:28,700 --> 00:19:33,000
bevy of information so it's not
like we're producing new
332
00:19:33,000 --> 00:19:37,100
knowledge every day but what
could help to facilitate
333
00:19:37,100 --> 00:19:41,100
students to provide and to
create new knowledge is if we
334
00:19:41,100 --> 00:19:45,400
can accelerate the information
manipulation process so that
335
00:19:45,400 --> 00:19:48,100
they can learn the theory so
that they can go ahead and
336
00:19:48,100 --> 00:19:51,500
create and innovate.
That is the full focus of
337
00:19:51,700 --> 00:19:55,600
the Media and Innovation Lab,
because previously, we've taught
338
00:19:55,600 --> 00:19:59,000
kids how to do research:
we are just collecting a whole
339
00:19:59,000 --> 00:20:02,000
bunch of stuff, putting it
together, and spitting out
340
00:20:02,000 --> 00:20:05,200
some kind of regurgitated
version of whatever they just
341
00:20:05,200 --> 00:20:08,000
looked up.
What if we provide it in a more
342
00:20:08,000 --> 00:20:10,400
synergized way?
And that's kind of what we're
343
00:20:10,400 --> 00:20:12,700
thinking, that innovation needs
to be
344
00:20:12,900 --> 00:20:17,300
at the focal point of all
education, from elementary all
345
00:20:17,300 --> 00:20:21,200
the way to professional.
Love it, and that's going to
346
00:20:21,200 --> 00:20:22,700
happen.
It's going to have to happen.
347
00:20:22,900 --> 00:20:24,700
It's going to be nice, too.
Yes.
348
00:20:25,000 --> 00:20:30,500
Absolutely, it has to, absolutely.
We went through a lot here,
349
00:20:31,100 --> 00:20:34,100
and I can't wait to share this
episode with the audience and
350
00:20:34,100 --> 00:20:36,500
also, I will put all the
information shared here in the
351
00:20:36,500 --> 00:20:39,300
show notes.
So you can go grab information
352
00:20:39,300 --> 00:20:43,000
about the Media and Innovation Lab, and
Dr. Seixas and his
353
00:20:43,000 --> 00:20:44,600
team.
And what they're all doing.
354
00:20:44,700 --> 00:20:46,700
I can't thank you enough for
coming on the show.
355
00:20:46,700 --> 00:20:51,000
It's been fantastic.
Holly this was the most fun I've
356
00:20:51,000 --> 00:20:54,400
had in a long time.
So thanks to you and your
357
00:20:54,400 --> 00:20:56,900
audience for listening to us and
follow us.
358
00:20:56,900 --> 00:20:59,500
We are across a wide
variety of different social
359
00:20:59,500 --> 00:21:03,800
media, from LinkedIn to Instagram
to Twitter: The MIL.
360
00:21:04,100 --> 00:21:08,600
Hashtag TheMIL,
T-H-E-M-I-L. Please find us,
361
00:21:08,600 --> 00:21:13,200
follow us.
Let's create and engineer the
362
00:21:13,200 --> 00:21:17,000
world that we seek. Excellent!
Well, thanks again for coming on
363
00:21:17,000 --> 00:21:19,600
and all that information,
again, will be in the show
364
00:21:19,600 --> 00:21:25,800
notes. You've just experienced
another amazing episode of EdUp
365
00:21:25,800 --> 00:21:27,000
EdTech.
366
00:21:27,500 --> 00:21:33,000
Be sure to visit our website at
edupedtech.com to get all the
367
00:21:33,000 --> 00:21:35,700
updates on the latest EdTech
happenings.
368
00:21:36,500 --> 00:21:37,800
See you next time.