Can AI Help Make Higher Education More Equitable?
AI and Equitable Education... We hear a lot about the harm that generative AI can cause on campuses, but can it also heal and improve equity in higher ed? Join host KiKi L'Italien for this episode featuring insights from Charles Ansell of Complete College America, Michael Baston from Cuyahoga Community College, and Audrey Ellis, the visionary behind T3 Advisory.
This discussion zeroes in on the role of Generative AI in creating a more equitable education landscape within higher education.
Tapping into the wisdom of the "Leveraging AI to Increase College Completion and Equity" playbook, our guests shed light on how AI technologies are not only reshaping the academic experience but also leveling the playing field for students from all walks of life. This episode explores how AI fosters an inclusive environment, enhances student success, and bridges longstanding gaps in educational equity.
Moreover, we'll tackle the significant policy conversations, showcase practical AI applications in academia, and forecast how AI continues to sculpt a future where equal opportunities in education are not just a goal, but a reality.
Tune in to uncover the pivotal role Generative AI plays in championing equitable education, marking a step forward in ensuring every student has the chance to succeed. Discover why AI in higher education is not merely a leap in technology but a leap towards hope and equal opportunities for all.
Transcript
KiKi L'Italien: Welcome to Association Chat, your online discussion where we warm ourselves by the virtual fire with topics of the day, welcoming thought leaders and trailblazers alike to join up in this online home for the community. I'm the host of Association Chat, KiKi L'Italien. Could AI revolutionize student success? Could it foster inclusivity? Could it bridge gaps in educational equity? We hear a lot about plagiarism. We hear a lot about the scary stuff. But what about the good stuff? Is it possible that AI could be our friend? Let's talk about it. I have some special guests with me here today to help answer some of these questions. Helping me to answer that question: Charles Ansell, Michael Baston, and we have Audrey Ellis with us here today. Hi, welcome, everyone.

Guests: Hi, thanks for having us.

KiKi L'Italien: All right, thank you for being here. I want to dive right into it so that we have a chance to get into this discussion. You know, AI, AI, AI. It's everywhere; you can't avoid it in the news and the headlines. And one place where you start to hear about complications almost exclusively is in higher ed. People are talking about not trusting the work that's being done; they're really dwelling on that in the conversations that I'm a party to on a regular basis. But I think what's really interesting is that the work that you all have been doing together is looking at how it might actually be something that is a positive. And I thought, you know, let's explore it. So I'm gonna start with you, Charles, if you don't mind. I want to ask you about how you see generative AI impacting policy and advocacy in higher ed, especially in terms of equity and completion rates.
Charles Ansell: Yeah, thanks for that question. I'll try to be brief, but it's actually really hard, because there's a whole lot to unpack there. I think that very often, when we think about AI, we think immediately, oh wow, this is such a powerful new way that we can do business and get work done. But unless we really think problem-first about what this sort of technology is supposed to solve, we're never going to use it right. And so in that question, how are we going to use AI to help with policy and advocacy, particularly for student success and for equitable outcomes in postsecondary, I think we need to zoom back a little bit. We have to start with the acknowledgement of where higher education is at, where the sector is at, in terms of equity and in terms of completion rates. Then, what are the policy and advocacy levers that we're supposed to use to ameliorate the situation? And really only then, how can generative AI help? At least that's how we see it at Complete College America. So just really quick, running that through.

Higher ed, in my opinion, is not really in a great place, and I don't think a lot of people outside of the sector really know this. It does indeed need transformation. So we'll start with some basic facts. For full-time students attending public four-year colleges and universities, the share who graduate in four years is 40%, and that's a number that's buoyed by well-resourced public flagship universities. For two-year colleges, the two-year graduation rate for full-time students is about 18%, so less than one in five, and it's roughly the same for part-time students at community colleges who graduate in six years. These numbers are lower for students who come from less-resourced communities and for students of color. And it's especially true because colleges struggle to work with working learners; students who work more than 20 hours per week rarely graduate, and when they do, they get saddled with debt. So the students who do go to college, the students who most need to be the beneficiaries of the transformative social mobility that college can offer, are the students who are most likely not to graduate the way the current system is set up.

This is happening, of course, at a time when a college degree has never had a higher premium. Not only do most jobs require a degree, but almost all good careers do. And by 2031, Georgetown estimates, I want to say it's 72%, or about three in four US residents, need to have a postsecondary credential of some sort to meet our workforce needs optimally. And yet in the last few decades, what we've set up is a financial system where the Pell Grant has only a third of its original purchasing power in tuition and student loan debt has skyrocketed. So in other words...
KiKi L'Italien: Yeah, hold up. Just hearing that, I think, oh my gosh, I had no idea. First of all, I had no idea that that was the case. But that's scary. That's really, I mean, that's something that people really rely on to be able to go. And sorry, I was just distracted, because that was a pretty crazy statistic added to all the others. Well...
Charles Ansell: Yeah, and I think the big problem, I mean, that's a problem in itself, but at the same time we're also putting forward this message, which is very accurate and true, that in higher education you need a credential now more than ever. So the only thing worse than struggling to afford it while attending, which is something that causes student attrition in very large numbers, is being shut out of the economy because you don't have the degree or certificate to be able to participate in the first place. And so this is where we get to those policy and advocacy levers for equitable student success. I know this is a long windup, but I am going to bring it to AI. I think you can boil it down to three things. First, we need to get out of our own way, and we need to get rid of bad practices and bureaucracy that prevent students from having purpose in their major, structure to their academic experience, momentum to timely completion, and enough support to complete college on time. And we'll hear from a college president who's doing exactly those things. Second, we need targeted interventions to support students equitably. Not every student needs the same thing; that's the whole premise. Some students may need relatively shallow support, some deep, some need different individualized attention depending on what their needs are. And the third is, we can't just efficiency our way out of this. There have been a lot of studies in high schools showing that if you add a lot of high-touch interventions in an under-resourced school, you can see incremental gains in high school completion rates, but it's never going to be as strong as that of well-resourced high schools. It's the same thing in colleges: student success interventions can create incremental change, and we've seen it in the last decade in the college completion movement, but we'll hit a ceiling soon. And we have to fund higher education like we care about it.

So that's where generative AI comes in, not just on that funding piece, of course, but essentially on those three areas. First, we have to have a stance toward generative AI as something that is a power for good, not just a power for plagiarism. To your point about all the controversy around AI in higher education, so much of the attention has been on the negative side of things, but we have to dive headlong into its promises. Second, we have to use it to solve the problems that we see in student success: having equitable interventions, the need to innovate on schedules and credit for prior learning for part-time students, the need to get all students on semester-by-semester academic plans, and really liberating advising functions. Where counselors meet with students, so much of that is manual, transactional conversation about which course at what time, and there's not enough time to have those rich conversations where advisors are able to give actual advice. And finally, and I would say most importantly, we need to make sure generative AI isn't the sole purview of rich colleges, the MITs, the Stanfords, the Penns, the colleges whose presidents are important enough to get yelled at by Congress. We need to make sure that community colleges, rural colleges, HBCUs, PBIs, and other minority-serving institutions aren't working off hardware infrastructure, routers and switches from the 90s, that is woefully inadequate for the horsepower required to optimize generative AI for student success. So I know that was a bit of a long answer, but I think it's good at the outset to just talk about the problems that we're trying to solve, because that's the only way we switch orientation on generative AI.
KiKi L'Italien: I mean, I think it was the perfect setup, because you provided a really great background and foundation for why we're having this conversation in the first place. So thank you for doing that. You talked about the promise of AI, the problems with success for many students, why they can't be successful or find success in higher ed, and then not leaving all of this to the rich colleges that can afford to be left alone when they're experimenting and trying some new things. It's fair, it sounds scary, let's say, trying new things. I want to actually bring this over to someone who probably has a very interesting perspective to share on it as well. So let's turn to you, President Michael Baston, and let's talk about this from the perspective of a community college president. What are the practical challenges and the opportunities of integrating AI into the curriculum? What are the challenges that you're seeing?
Michael Baston: Well, first and foremost is to desensitize the faculty who have to invigorate and integrate AI into the curriculum, because, as Charles mentioned, some immediately think about plagiarism, and that sort of colors the way they look at the conversation. But the fact is, AI is not going anywhere, and the fact is that students will have access to the technology. So it is not feasible to simply say, well, we're going to put in our syllabus that you can't use any AI, because they're going to use Grammarly, which is AI, and they're going to use all of the other kinds of technology and tools that are going to be helpful, and that, actually, the faculty do want them to have as support. So I think it's important for educational institutions to resource and provide professional development for faculty so that they can best integrate the opportunities with AI. And particularly at the community college, where we focus heavily on the career exploration of our students and the kinds of career aspirations that they have, and on preparing those students to go into the world of work, for those who will go right into it, you want them to be prepared for the expectations of employers who, when they get into the workplace, are going to expect them to work smarter, not harder, and to produce at a higher level. So even the basic premise of generative AI at the prompt, the ability to get students to understand prompt characteristics and how to use them effectively, will actually separate out those in the workplace who don't have that orientation, background, and skill set. We do a disservice to our students if we are not arming them with everything they need to effectively navigate the realities of a changing landscape, whether it's in education or in the world of work.

And also, listen, we have Turnitin, so you can get your students to actually look at the content. Turnitin tells you not only about plagiarism, but also about AI-generated content. So it is not that faculty members are completely devoid of any ways to ensure originality and authenticity of student work. It is to show students: use this for inspiration, use this for feedback, use this to help you. I can't be with you 24/7; you're at three o'clock in the morning doing your essay, and I can't be there with you to do that essay. But you can use this technology to help you spark ideas. At the end of the day, you're gonna get credit for your originality, you're gonna get credit for your authenticity, and we shouldn't act like they're not going to have access to what they already have access to. Right?
KiKi L'Italien: I mean, I remember this back when we started looking at things like social media in the workforce. What we saw was this desire to control by pretending, if we suppress this, if we said this technology is not allowed to be used, then what happened? Everybody's over here doing it anyway. You couldn't suppress it, you couldn't outlaw it. And when you're talking about something like generative AI, I think the challenge is that, to your point, people are focusing on weaponizing it, in the sense that it can be used for things like creating malicious code or breaking the rules by plagiarizing and things like that. But you could also be weaponizing it on the other side by not allowing people to learn how to effectively use it in day-to-day life, when it's such an effective tool. So...
Charles Ansell: Can I just add something on that point?
KiKi L'Italien: Yeah.
Charles Ansell: I think, you know, what seems particularly crazy to me, and the social media thing is a really good point, there's a philosopher, I just don't know the person's name, but I heard it on another podcast, who said when you invent electricity, you invent electrocution. When you invent anything good, you invent the bad thing, too. And sometimes we don't acknowledge that we invent the bad thing. In this case, I think it's the opposite. It's like we just ran to, oh no, we invented electrocution, without even bothering to see what sort of power we had in our hands that we could harness. And I think, to President Baston's point, it's very telling that certain things that students with way more resources have been able to do forever, like pay people to write essays for them, now that everybody can do it, now that it's democratized, oh no, now it's bad. And I think that's a big tell about what our priorities are.
KiKi L'Italien: I think that's really, I mean, it's a great segue into the next question that I had, because I want to reach over to Audrey and give Audrey a chance to speak to this great playbook and the work that you've been doing. You've been thinking about this quite a bit, this topic and the issue at hand, which is: what are we doing to actually take the power of some of this generative AI, and some of the advancements in AI, and use it for good? How can we allow higher ed to actually make use of this in a better way? So you have a playbook, and in this playbook you outline strategies for leveraging AI to increase college completion and equity. I would love it if you could elaborate on one or two of the key strategies that would be particularly relevant for association professionals who are listening to this now, or maybe listening to it later, for them to pay attention to.
Audrey Ellis: Sure. And it is a fun thing to think about, right? To treat every problem as an opportunity. We really did try to take that mindset and kind of flip the problem on its head while we were working on building out this playbook. So grateful to Charles and Complete College America, who really gave us the platform to facilitate this discussion, which was built by so many thought leaders in the higher ed space. I'm going to go a little bit more broad in terms of the practices or strategies that I suggest, because we're still in the early stages, and I don't think it makes a lot of sense to get too tactical right now. But kind of building off of Charles's and President Baston's points, the first suggestion that I have is to think about how you can create learning opportunities for your higher education partners, in whatever industry you're in. I know that today we're speaking to industries really from across the workforce. And someone I saw from Facebook said higher ed should partner more with industry partners, and I couldn't agree more; I think that's what's so exciting about this podcast. I spoke with a college provost recently who was working with their welding corporate partner to go learn all about how they are using AI in their welding work right now, and to then revamp all of their curriculum to have AI embedded, so that they're preparing their welding students to graduate with those competencies instead of having to get them out in the field. And as much as higher ed is a source of learning and knowledge, we're on campus, right? We're working and doing research and working with our students, and community colleges especially are a really amazing intersection of the workforce space and the campus environment. With such a rapidly changing technology, we can't possibly know exactly how it's impacting your industry unless the lines of communication are wide open. And because this is also a cost-prohibitive technology right now, there's a lot of upfront cost in building your own AI infrastructure, or even getting your data ready to the point where you could do that. The other, and I guess this is kind of a sub-strategy, one recommendation would be to work with your industry or your higher education partners to support them in their learning, even if that looks financial, because it's an investment in your future workforce, to ensure that they're at an institution that is preparing them for the expectations you're going to have for your future employees.

The second piece that I'll touch on is around thinking about things in phases. We kind of talked about this; Charles has a really interesting analogy of electrocution and electricity. We do tend to think of the ends of the spectrum, right? Like, what's the most futuristic AI vision possible, and what's right now? But there really, in my mind, are kind of three stages that we need to be thinking about, which might look different for every institution, depending on if you're an association or an institution of higher education: right now, the middle term, and the future. And I think we can't miss that middle term, because we're in this process of building towards that potential future moment, and if we're not planning and thinking about all of those stages, we're going to do ourselves a disservice when we get to that middle stage. We also need to be thinking really creatively and big picture, like Charles suggested, around what is the ultimate problem we're trying to solve, not just how can we efficiency ourselves out of this situation in this exact moment. So keeping all three of those timeframes top of mind, I think, applies to every industry right now, because AI is such a ubiquitous tool that can apply to really any element of our lives.
KiKi L'Italien: Yeah. And we had a comment from someone who is watching live who said, remember when electronic calculators weren't allowed in the classroom? I do, actually. And we laugh about it now, but it definitely wasn't that long ago, relatively speaking. And it does feel like that, where, when you have these tools, why wouldn't you figure out how to use them? It's almost a hindrance if you don't teach how to use them accurately and in the best possible way to help people do what they need to do better. Yeah.
Audrey Ellis: And if I could just add one piece to that, I think the calculator is a really convenient analogy or mental reference point for AI. I've heard some interesting pushback on that, because AI is applicable to so much more than just math. But I think that we need to not just teach people how to use AI, but also how to decide if AI is necessary to use, and you can't really make that distinction if you've never used it. So it might be good to learn your times tables, to memorize your multiplication figures without a calculator, but if you know that there's this thing out there that you've never been allowed to try, you might be very tempted. Instead, we should be, I think, convincing people and giving them that ability to judge for themselves when it's appropriate to use and when it isn't.
KiKi L'Italien: That's a great point, because my daughter is getting ready to learn how to drive, and she doesn't know life without GPS existing. But we also know that sometimes it doesn't always work, so doesn't she need to know how to use an atlas if she needs to? Right. So these are the things, and I think it's a really good point: when should we use it, and when shouldn't we? And that actually, thank you very much, Audrey, leads me into my next question, which has to do with some of the ethical considerations. We're thinking specifically about associations in this case, but it really could be any organization: when you're thinking about enhancing engagement, enhancing learning, what are the ethical considerations that you should be mindful of? This is an open question. I don't know if Michael would like to answer it, or Charles or Audrey, but it's open to anyone who feels particularly passionate in their answer.
Michael Baston: Clearly, recognizing the importance of due diligence is critical. From my perspective, you ultimately have to understand that the people who put the information in the algorithms may not necessarily have cultural sensitivities, and you may not have accurate information. So there's a certain amount of expectation on whoever is going to use the AI that there is going to be a need to engage in that level of due diligence. Because, quite frankly, when we don't see it as a tool, when we think that it is a solution and not a tool, then we ultimately could be spreading false information; we could actually be perpetrating a lot of fraud out there with information. So it's very important, from the ethical perspective, that you, as a person who's going to utilize the AI, actually do your due diligence, so that you're not spreading misinformation and bad information, quite frankly.
KiKi L'Italien: Well, so, Charles, were you about to say something?
Charles Ansell: Well, I was actually going to say, and I hope Audrey doesn't mind, but I think of your story, and I think you should be the one to share it, the one that the Boston Globe picked up on. If you think of this example and what it could do in the classroom or for policymakers, if AI hallucinates these answers and incorporates these biases, I think it's emblematic of the real dangers. But I don't want to tell Audrey's story for her, so...
Audrey Ellis: Sure. I think it's a pretty funny story, and it is kind of a low-stakes way to illustrate how wrong these situations can end up. So, backstory: I am on my last year of playing fantasy football. I don't have time for it anymore, but I was in my husband's league for about eight years. We played on Yahoo Fantasy, and my team name, because I was the only female in the whole league, was Play Like a Girl, which is a few years old by now, but it was like the Nike slogan for women's soccer, and I loved that and thought, you know, why not? So, I did terribly this year, and for the record, I have won before, but this year was not my year. I kind of just gave up, and then I got a season recap that was ChatGPT-generated by Yahoo. And ChatGPT really just took my team name, Play Like a Girl, and the fact that I lost, and ran with it. This summary was absolutely full of suggestions that, like a girl, I lost and had a terrible season, and, like a girl, I don't know anything about football. And honestly, I think, Charles, you're misremembering; the Boston Globe did not pick this up. But I did submit feedback to Yahoo. And I think that, as funny as it is, and generally people are playing fantasy football as a hobby and hopefully not in many professional contexts, people can put all kinds of things in their team names and in their smack talk and all of that. And the fact that all of that was potentially informing this use case for an AI bot, unchecked, is a pretty risky application so early in this time of generative AI. So hopefully they learned their lesson, but hopefully it also brought some laughs.
KiKi L'Italien: Oh my gosh, that sounds like it could have been a nightmare, couldn't it? It is such a great example, though. Charles, I'm so glad you brought it up, because you can easily see where that could have gone really, really terribly wrong if it had been at a larger scale. So excellent, excellent example. Well, you know, I was wondering, for associations, there's an association for everything, as we like to say, and so certainly there's always this concern about where things can go wrong and how we can mitigate risk. I'm sure in higher ed you're very familiar with this concern about mitigating risk. So are there ways for associations to advocate for or against certain uses of AI in education, to ensure that it benefits a diverse student body? Because we have associations that are looking out for their members who are going into their industries, starting all the way back, going through school, and they are very concerned about making sure that their science or their industry is receiving the type of education and protection and concern that it needs to have. And certainly with a new technology, or something that's advancing so quickly, there's a lot of concern about, well, how do we make sure it benefits our people, our students, the ones who are coming into our field?
Charles Ansell: I can start on that. I think there are two things that come to mind for me when you say that, KiKi. One is, I think it's important to also just use our discussion as a sort of analogy or model, or whatever word you want to use, for your own association and what you're trying to represent for your members. Because the problems that we're grappling with in higher education carry over to real estate, they carry over to health care. Are you being problem-first in terms of what this technology can do? Are you taking a comprehensive look at all the issues that your association is grappling with? Are you tracking those issues in a mutually exclusive manner, such that you know exactly where AI could plug in, or any technology for that matter, and what the limitations are? If you just don't do that, it's going to be, I think, a lost cause.

The second piece, really getting directly to your question of how associations can interact with higher education on this, and I'm curious what others here, Audrey and President Baston, think about this, but I think that we need to find a way to very quickly get AI learning outcomes into the heads of all of our students in all disciplines. Because what I fear is going to happen is that a lot of the social mobility ladders that exist to really good-paying jobs, really strong careers, thinking of major accounting firms, major law firms, major consultancies, lots of these things start at the associate level. And we're beginning to see the inklings that a lot of those jobs that start these career ladders could go away, because AI can sweep these knowledge bases and do the things that end up informing the lawyers or the consultants or the engagement managers. And yet the way higher ed works, it's kind of slow, right? To get new programs approved and all of this, you have to go through the accreditation process, and sometimes these are multi-year things. So I think there are players in the field of postsecondary, and associations, and I think Complete College America is one of them, and I encourage folks, if they want to reach out after this, maybe they could have my contact info, that can look at ways to make sure that at the course-by-course level, and at the certificate level, these competencies, as the field is just starting, get swept in right away. That way, students are ready in their existing programs of study, and we don't need to wait the several years that higher ed always takes to catch up to these things, if that makes sense. For people who are not in higher ed, so often we think of the 7% of colleges that are selective enrollment, that get hauled in front of the Senate to get yelled at and so on, because that's where most of the media went to college. But 93% of students are at non-selective colleges, most of them under-resourced, and lots of them, through no fault of their own, have to move a little slower than they could otherwise. So we need to find ways to fund that, but also to disrupt it, so that students are ready not just in terms of the AI that is used to deliver higher education, which has been our discussion so far, but also in terms of being a successful alum, which is the point of higher ed, and that includes being very competent in AI.
KiKi L'Italien: I mean, well said. And actually, I want to give a shout-out to a friend of ours in the association space, Six Degrees of Associations, who says, "Love this discussion, so many aha moments in this one." I have to say, first of all, check out their podcast too, and I can't help but agree; I think there are a lot of aha moments I've certainly heard in this one. And you know, Charles, when you were talking about it, I was thinking, absolutely. Associations are always thinking about, what's the future for the industry? What's the future for our members? How do we best prepare them? Like higher ed; higher ed is also thinking about this. And when you want to talk about some really great partnerships, higher ed and the associations that are working to represent some of those industries are all concerned about the students who are going through, the individuals who are going through, and then forming the workforce and really the society that's around us. So that all leads me to this next question, which has to do with: what are some examples where it's already beginning to work, or where we're starting to see, okay, we're using AI in a meaningful way that is successful? Something we would look at and say, okay, AI, we have you in this higher educational setting. What lessons can we learn from some of the early success stories that we're hearing?
Michael Baston: We do have faculty right now who are fully engaged in piloting a number of innovative utilizations of AI. So I think that, while there might be this suggestion that higher education writ large is concerned about AI and doesn't want to get involved with it, that's not the case everywhere. There are a lot of places that recognize that we can't unilaterally disarm opportunity for students. The fact is, if we want graduates who are going to effectively navigate the workplace environment, and this technology is now an expectation in that environment, then if we don't look at our learning objectives, reassess them, and actually invigorate our syllabi in ways that allow students to develop the skill sets, they're not going to be as successful, and we're not going to be as effective in the work that we do. So there are a lot of folks all around the country in higher education spaces who are already starting to really examine this work. You look at Maricopa; that system actually has a brand new degree now. So while it often takes a long time for some in higher ed to move, in the non-degree space you can run certificate programs pretty flexibly, and you can build those stackable credentials into existing programs. So it's not that we are completely handcuffed from being innovative. What we have to do is understand the importance of balancing AI with the human interaction, because it's not gonna replace all of the humans, but we've got to learn how to best integrate the efforts together.
KiKi L'Italien: Right. That balanced approach, I think, is full of nuances, and as with anything that has nuances, when something's not black or white, it's very difficult to know what the right thing is and the best way to navigate it. But we have to figure out the best way forward. Audrey, do you have examples? What are you hearing? I know you're pulling together stories and talking with everyone.
Audrey Ellis: Yeah, we are
creating a council on equitable
576
00:35:21,000 --> 00:35:24,060
AI or we have created and we're
meeting later this week in
577
00:35:24,060 --> 00:35:27,720
person for the first time, which
is really exciting. And so we
578
00:35:27,720 --> 00:35:31,860
hope that that will really help
us kind of gather and collect
579
00:35:31,860 --> 00:35:34,830
these practices, because I
couldn't agree more President
580
00:35:34,830 --> 00:35:40,650
Bastien that this is happening
in incredible scope and scale,
581
00:35:40,680 --> 00:35:43,830
unfortunately, what I'm seeing
is that it's happening in a
582
00:35:43,830 --> 00:35:47,730
really decentralized way, which
is not bad for innovation, don't
583
00:35:47,730 --> 00:35:51,390
get me wrong, but makes it
really hard to learn from what
584
00:35:51,390 --> 00:35:55,500
others are doing and replicate
without starting from scratch in
585
00:35:55,500 --> 00:36:01,590
every pocket. And so you have
institutions that are maybe have
586
00:36:01,590 --> 00:36:05,910
some pockets of enthusiasm, and
some pockets of terror, or
587
00:36:06,390 --> 00:36:08,250
however you want to call it,
whatever you want to call it.
588
00:36:08,460 --> 00:36:11,040
And then you have institutions
themselves, where you have some
589
00:36:11,040 --> 00:36:13,440
institutions that are just
totally on board and some that
590
00:36:13,440 --> 00:36:17,130
are not. So I think that that
that is, it's really
591
00:36:17,130 --> 00:36:20,130
interesting, because there's a
lot of great examples of AI
592
00:36:20,160 --> 00:36:23,760
broadly, right, and what it's
how it's been used in AI, in
593
00:36:23,760 --> 00:36:27,600
higher education, you know,
Georgia State University, as in
594
00:36:27,600 --> 00:36:30,870
a lot of retention work for
years, and kind of initially
595
00:36:30,870 --> 00:36:36,780
started a lot of that focus.
Recently, John Jay partnered
596
00:36:36,780 --> 00:36:40,950
with a startup that was, you
know, a nonprofit startup to
597
00:36:41,220 --> 00:36:44,340
figure out how to increase their
retention. But these are
598
00:36:44,460 --> 00:36:50,430
older, or rather less
groundbreaking, types of AI,
599
00:36:50,460 --> 00:36:55,380
because AI is a wide swath of
technology. What we're really
600
00:36:55,380 --> 00:36:59,010
interested in right now, and
everyone is, is generative AI,
601
00:36:59,040 --> 00:37:03,480
ChatGPT, things like that, and
how that can then inform
602
00:37:03,510 --> 00:37:06,930
everything else, even other
types of AI. And that's where I
603
00:37:06,930 --> 00:37:10,290
think we still haven't really
been able to pin down
604
00:37:10,500 --> 00:37:14,430
exactly what institutions are
doing necessarily, or anything
605
00:37:14,430 --> 00:37:18,390
beyond pilots, because there
hasn't been enough time, we need
606
00:37:18,390 --> 00:37:22,710
to see how this goes. But we
also can't wait necessarily,
607
00:37:23,070 --> 00:37:27,630
until they're done or until the
academic peer review process
608
00:37:27,840 --> 00:37:30,810
finalizes so that, you know, a
journal article can get
609
00:37:30,810 --> 00:37:34,950
published to talk about it,
because I can guarantee that
610
00:37:34,950 --> 00:37:37,800
other institutions are thinking
about it or attempting it
611
00:37:37,800 --> 00:37:41,370
themselves, and would benefit
greatly from hearing kind of
612
00:37:41,370 --> 00:37:45,600
more intermediate updates on how
things are going. So that's why
613
00:37:45,600 --> 00:37:48,390
we're really trying to
facilitate that more rapid
614
00:37:48,420 --> 00:37:51,540
lesson sharing practices through
the playbook, through the
615
00:37:51,540 --> 00:37:56,730
council, so that we're not on, you
know, a hurry-up-and-wait
616
00:37:56,730 --> 00:38:01,020
type of timeline and
process, because AI is not
617
00:38:01,020 --> 00:38:04,050
waiting. It is just like full
speed ahead, whether or not we
618
00:38:04,050 --> 00:38:06,270
get on its timeline.
619
00:38:09,120 --> 00:38:11,400
Charles Ansell: You know, I
think, you know, in terms of the
620
00:38:11,400 --> 00:38:13,500
use cases, just thinking about
the associations in the
621
00:38:13,500 --> 00:38:17,520
audience, I think that it's
important to take a look at the
622
00:38:17,550 --> 00:38:20,880
playbook that I believe was
shared, you know, around or can
623
00:38:20,880 --> 00:38:24,600
be shared in the chat that
Audrey is referring to. Because
624
00:38:24,750 --> 00:38:26,910
even though it's highly
specific, I think it's organized
625
00:38:26,910 --> 00:38:29,160
into things that all
organizations end up benefiting
626
00:38:29,160 --> 00:38:32,190
from, right, like we split it
by, like, you know, teaching and
627
00:38:32,190 --> 00:38:36,030
learning and student success and
organizational effectiveness,
628
00:38:36,930 --> 00:38:40,080
and data. And sure, some of
these won't be one to one,
629
00:38:40,080 --> 00:38:41,970
right, especially like the
teaching and learning session,
630
00:38:42,000 --> 00:38:44,820
although most companies have
professional development, right?
631
00:38:45,510 --> 00:38:48,630
Most companies have a knowledge
base. And I think that the big
632
00:38:48,630 --> 00:38:51,450
thing to get those use cases
going at such an embryonic stage
633
00:38:51,450 --> 00:38:54,360
for the industry is to get the
sample prompts into more and
634
00:38:54,360 --> 00:38:58,260
more hands. And so whether
it's porting these over
635
00:38:58,260 --> 00:39:01,950
yourself, or even asking a
generative AI to like do that
636
00:39:01,950 --> 00:39:05,010
work for you: how can this
document work for my industry? I
637
00:39:05,010 --> 00:39:07,170
think that'd be like a very
worthwhile task and could save
638
00:39:07,170 --> 00:39:08,190
hundreds of hours of time.
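For listeners who want to try what Charles describes here, a minimal sketch of asking a generative AI to adapt the playbook's sample prompts to another sector. It assumes the OpenAI Python SDK purely as one possible provider; the model name, file path, and prompt wording are illustrative placeholders, not anything specified in the episode or the playbook.

```python
# Hypothetical sketch: ask a generative AI to port a higher-ed playbook
# excerpt to another industry. Model name and file path are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("playbook_excerpt.txt") as f:
    playbook_text = f.read()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "You adapt higher-ed AI guidance for other sectors."},
        {"role": "user",
         "content": (
             "Here is an excerpt from a higher-ed AI playbook:\n\n"
             f"{playbook_text}\n\n"
             "Rewrite its sample prompts so an association's professional "
             "development team could use them with its own knowledge base."
         )},
    ],
)

print(response.choices[0].message.content)
```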
639
00:39:08,910 --> 00:39:10,950
KiKi L'Italien: I mean, it
really could. And what I was
640
00:39:10,950 --> 00:39:14,940
gonna say is, I love
when somebody can share their
641
00:39:14,940 --> 00:39:18,870
prompt, and I know that it's
going to get me closer to more
642
00:39:18,870 --> 00:39:22,920
usable information that I can
then apply and customize for
643
00:39:22,920 --> 00:39:27,360
myself. So excellent, excellent
advice there. And for anyone
644
00:39:27,360 --> 00:39:31,140
who's listening to this
now or listening to it later.
645
00:39:31,710 --> 00:39:34,740
What I'm going to do is I'm
gonna grab the link to that
646
00:39:34,740 --> 00:39:37,290
playbook. If you want to check out
the playbook, I'll put it in the
647
00:39:37,290 --> 00:39:40,980
show notes from the edited
version of the show, and then
648
00:39:40,980 --> 00:39:45,750
you'll be able to download that
there and go check it out. You
649
00:39:45,750 --> 00:39:50,130
know, one thing you said
earlier, Audrey: you
650
00:39:50,130 --> 00:39:53,580
talked about how you can't wait
necessarily for the peer
651
00:39:53,580 --> 00:39:57,570
reviewed everything to come out
when there's something advancing
652
00:39:57,570 --> 00:40:01,800
so quickly, right, in the case of
AI. We can't wait until the end
653
00:40:01,800 --> 00:40:03,960
of the day. It's not like it's
going to just stop and say,
654
00:40:03,960 --> 00:40:06,930
Okay, now catch up. Let's wait
for all the peer reviewed
655
00:40:06,930 --> 00:40:10,050
research to be done. We have to
be sort of developing, as they
656
00:40:10,050 --> 00:40:12,510
say, you know, while the plane's
in the air, we have to be
657
00:40:12,510 --> 00:40:18,150
building the plane, right? So
my point to that is this:
658
00:40:18,150 --> 00:40:24,120
quantifying success. How can we
look at it? And this is forever the
659
00:40:24,120 --> 00:40:27,600
thing that I know, my friends in
association land are looking
660
00:40:27,600 --> 00:40:31,800
for. I know, like any
professional, really anyone
661
00:40:31,800 --> 00:40:34,590
in higher ed, you're looking
for: how can I quantify that
662
00:40:34,590 --> 00:40:37,260
this is something that's
successful? So what are we
663
00:40:37,260 --> 00:40:41,400
looking at? What are some
measures of success that we
664
00:40:41,400 --> 00:40:45,300
might be able to point to to
say, yeah, here's where it's
665
00:40:45,300 --> 00:40:51,030
working. Here's where we know
that either the efficiency, or
666
00:40:51,450 --> 00:40:54,690
the new ideas and innovation
that's a result of it, the new
667
00:40:54,690 --> 00:40:58,740
products, etc., etc., can come
into play.
668
00:41:01,050 --> 00:41:02,070
Charles Ansell: Is that for
anybody?
669
00:41:02,430 --> 00:41:03,930
KiKi L'Italien: This is for anyone.
And...?
670
00:41:10,860 --> 00:41:15,000
You go, Charles, I love it. I
love answers.
671
00:41:16,200 --> 00:41:17,790
Charles Ansell: Well, I'll just
say that I think that there's
672
00:41:17,790 --> 00:41:19,770
probably two different types of
metrics, right. And I'm sure
673
00:41:19,770 --> 00:41:22,440
that this ports over into other
sectors for higher ed, there's
674
00:41:22,440 --> 00:41:24,690
like your student success
metrics. And then there's like
675
00:41:24,690 --> 00:41:28,260
the operational metrics that you
use to get that done. Right. And
676
00:41:28,260 --> 00:41:32,760
so, you know, when we look at
things like, you know, are we
677
00:41:32,970 --> 00:41:35,550
increasing our graduation rates?
are we increasing retention?
678
00:41:35,550 --> 00:41:37,440
there's usually going to be
leading indicators of that,
679
00:41:37,470 --> 00:41:39,870
like, how many students are on
education plans? And can we
680
00:41:39,870 --> 00:41:42,960
track that like week to week,
because we know that the proof's
681
00:41:42,960 --> 00:41:46,770
out there that students with by-
semester education plans who
682
00:41:46,770 --> 00:41:49,500
actually register towards them
end up graduating at a higher
683
00:41:49,500 --> 00:41:51,600
rate. I'm sure there are
analogous outcomes metrics
684
00:41:51,600 --> 00:41:55,170
versus process metrics for all
types of sectors on this call.
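To make the outcomes-versus-process distinction Charles draws here concrete, a minimal sketch in pandas of tracking one leading indicator week over week and then going a layer down to how each plan was created. The column names, the "ai_assistant" label, and the numbers are hypothetical, not data from any institution in the conversation.

```python
# Hypothetical sketch: a process metric (share of students with an education
# plan, by week) plus one layer down (how the existing plans were created).
import pandas as pd

students = pd.DataFrame({
    "student_id":  [1, 2, 3, 4, 5, 6],
    "week":        [1, 1, 1, 2, 2, 2],
    "has_ed_plan": [True, False, True, True, True, False],
    "plan_source": ["advisor", None, "ai_assistant", "advisor", "ai_assistant", None],
})

# Process metric: share of students with an education plan, by week.
plan_rate = students.groupby("week")["has_ed_plan"].mean()

# One layer down: of the plans that exist, what share came from each source?
plan_counts = (
    students[students["has_ed_plan"]]
    .groupby(["week", "plan_source"])
    .size()
)
source_mix = plan_counts / plan_counts.groupby(level="week").transform("sum")

print(plan_rate)
print(source_mix)
```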
685
00:41:55,860 --> 00:41:59,610
And so I think that you probably
get pretty deep into some weedy
686
00:42:01,320 --> 00:42:04,410
AI metrics, right. So it's like,
what percent of students are in
687
00:42:04,410 --> 00:42:07,110
education plans, then you go one
layer down: to what extent were
688
00:42:07,110 --> 00:42:09,990
they created by this technology
or that technology, and tracking
689
00:42:09,990 --> 00:42:13,770
that like day to week? I think
that we're very nascent here. So
690
00:42:13,770 --> 00:42:16,140
we haven't like made that full
metrics tree as a sector yet for
AI, or at least I haven't seen
00:42:16,140 --> 00:42:18,630
AI, or at least I haven't seen
it, because we need to, like
692
00:42:18,630 --> 00:42:21,420
start getting more of these use
cases. I would also say that,
693
00:42:22,410 --> 00:42:25,020
again, I don't know how this
ports outside of higher ed, but
694
00:42:25,230 --> 00:42:27,450
I'm curious if Audrey and
President Baston agree with
695
00:42:27,450 --> 00:42:31,500
this. But I think that we can
also like fetishize the
696
00:42:31,830 --> 00:42:35,550
causation versus correlation
thing a little too much. Because
697
00:42:35,550 --> 00:42:37,500
in higher ed, you know, I think
about the things that you know,
698
00:42:37,500 --> 00:42:40,230
Tri-C is doing and other colleges,
where you're doing a lot of
699
00:42:40,230 --> 00:42:43,440
things at once, and you're not
waiting to say, Oh, it was the
700
00:42:43,440 --> 00:42:46,230
ed plans. Oh, it was making
sure every student has an advisor,
701
00:42:46,320 --> 00:42:49,170
and I think AI is just going to
be in that mix in that way. And
702
00:42:49,170 --> 00:42:52,260
we're never going to be able to,
you know, if you make an
703
00:42:52,260 --> 00:42:54,750
academic scholarly article about
it, that would be helpful for
704
00:42:54,750 --> 00:42:56,850
the field if somebody wants to
do a randomized controlled
705
00:42:56,850 --> 00:42:59,280
trial, but the problem is, if
it's a common sense
706
00:42:59,280 --> 00:43:02,400
implementation solution,
you hate to have somebody be the
707
00:43:02,400 --> 00:43:05,340
control group. So I hope that
makes sense. It wasn't too like
708
00:43:05,340 --> 00:43:05,790
nerdy.
709
00:43:07,200 --> 00:43:08,220
KiKi L'Italien: I don't think it
was too nerdy.
710
00:43:09,930 --> 00:43:12,480
Audrey Ellis: No, I agree with
you, Charles, I think that it's
711
00:43:12,540 --> 00:43:15,240
just like all of these other
interventions that we talked
712
00:43:15,240 --> 00:43:18,570
about, it's really difficult to
isolate the effects of one
713
00:43:18,570 --> 00:43:21,390
specific thing if you have
people who are benefiting,
714
00:43:21,390 --> 00:43:25,350
hopefully benefiting from many.
So I think that that's a fair
715
00:43:26,250 --> 00:43:29,490
assertion to make. One other
piece that I want to share is
716
00:43:29,520 --> 00:43:34,380
around more of like a systems
level thinking about knowledge,
717
00:43:34,410 --> 00:43:38,700
which maybe sounds really broad
and lofty. But I think that one,
718
00:43:39,060 --> 00:43:43,110
one of the things that might
be, as a hypothesis, I
719
00:43:43,110 --> 00:43:48,180
guess, at the root of some of
the kind of panic around AI is
720
00:43:48,420 --> 00:43:52,860
that AI is a threat to kind of
how we value knowledge, how we
721
00:43:52,860 --> 00:43:55,740
think it can or should be
assessed, how we think
722
00:43:55,740 --> 00:43:58,890
it should be measured. And I
think that we're gonna see a lot
723
00:43:58,920 --> 00:44:04,020
of conversation here in the
coming months and years ahead,
724
00:44:04,020 --> 00:44:08,040
because, as the purveyors of
knowledge in higher education,
725
00:44:08,460 --> 00:44:11,640
I think there's a feeling that
kind of, you know, we're the
726
00:44:11,640 --> 00:44:16,320
experts on that. But this is
really going to shake up and
727
00:44:16,320 --> 00:44:22,080
shift potentially, how we view
all of that and how we do assess
728
00:44:22,080 --> 00:44:25,620
that. And so one leading
indicator that I'm looking for,
729
00:44:25,980 --> 00:44:29,010
in the equity space, on the equity
730
00:44:29,010 --> 00:44:33,570
conversation, in particular, is
for new types of knowledge and
731
00:44:33,600 --> 00:44:37,620
voices and perspectives from
groups of people who have
732
00:44:37,620 --> 00:44:41,700
historically not had their
knowledge valued at
733
00:44:41,700 --> 00:44:46,140
the same level as, perhaps,
folks who can make it all the
734
00:44:46,140 --> 00:44:49,320
way through an academic journey
all the way to that point of
735
00:44:49,320 --> 00:44:53,460
peer-reviewed work. I want to
see new voices, and I
736
00:44:53,460 --> 00:44:57,930
think AI will be working if it's
helping bubble up and surface,
737
00:44:58,050 --> 00:45:02,400
different perspectives that, in
the past, because of kind of the
738
00:45:02,400 --> 00:45:06,240
structure of higher
education, those perspectives
739
00:45:06,240 --> 00:45:10,020
might have been kept out. So
that's more on like a societal
740
00:45:10,020 --> 00:45:13,530
systems level. But I think that
will be a really great signifier
741
00:45:13,530 --> 00:45:17,040
that we're doing something right
if we're making new paths for
742
00:45:17,040 --> 00:45:20,460
people to contribute to our way
of thinking through AI.
743
00:45:21,510 --> 00:45:24,660
Michael Baston: And I would just
add, from my perspective, as
744
00:45:24,660 --> 00:45:29,250
well, about expanding capacity:
how does the utilization of AI
745
00:45:29,430 --> 00:45:34,230
enable us to expand the kind of
capacity so that we can actually
746
00:45:34,230 --> 00:45:37,950
get more people getting the help
they need when they need it at
747
00:45:37,950 --> 00:45:41,490
times that are convenient to
them. That's why from my
748
00:45:41,490 --> 00:45:45,990
perspective, it is critical for
us to be able to really master
749
00:45:46,020 --> 00:45:49,380
prompt generation, so that we
actually can get the right
750
00:45:49,380 --> 00:45:54,240
quality information. And we can
find ways to
751
00:45:54,300 --> 00:45:58,530
streamline processes and
accelerate the ability
752
00:45:58,530 --> 00:46:01,560
of people to navigate complex
systems that don't need to be
753
00:46:01,560 --> 00:46:05,160
complex. You know, when we think
about the bureaucratic structure
754
00:46:05,160 --> 00:46:08,160
of higher education, it was
built not to help
755
00:46:08,160 --> 00:46:11,940
the student, but to help the
bureaucracy of the
756
00:46:11,940 --> 00:46:16,740
administration of higher
education. Well, if we now don't
757
00:46:16,740 --> 00:46:20,280
need all of that bureaucracy
that supports the actual
758
00:46:20,280 --> 00:46:22,980
structures, but not the
students. And we actually can
759
00:46:22,980 --> 00:46:28,170
utilize some technology that
gets at minimizing all of that
760
00:46:28,230 --> 00:46:32,880
bloat. Wow, what a wonderful
breakthrough opportunity for
761
00:46:32,880 --> 00:46:35,760
students: you
know, at three o'clock in the
762
00:46:35,760 --> 00:46:38,340
morning, when they're up and
they need somebody, they
763
00:46:38,340 --> 00:46:41,880
actually can go to a knowledge
base that can resolve issues
764
00:46:41,970 --> 00:46:44,250
without talking to somebody
who's going to send them
765
00:46:44,250 --> 00:46:46,950
someplace else. And then they
play the chutes and ladders game
766
00:46:46,950 --> 00:46:51,870
until they drop out of school.
So here's an opportunity if we
767
00:46:51,870 --> 00:46:54,180
think about it and we utilize
it well.
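President Baston's three-in-the-morning knowledge-base example can be pictured with a toy sketch: a student question matched against a small FAQ so routine issues get answered without a referral chain. The FAQ entries and the simple fuzzy matching are purely illustrative; a real deployment would sit on institutional content with a proper retrieval or generative AI layer behind it.

```python
# Toy sketch of a 24/7 student knowledge base; entries and matching are
# illustrative only, not any real institution's system.
import difflib

FAQ = {
    "how do i apply for financial aid":
        "Complete the FAFSA, then check your student portal for next steps.",
    "how do i register for classes":
        "Registration is in the student portal under 'Enroll'; review your education plan first.",
    "who is my advisor":
        "Your assigned advisor is listed on your student portal dashboard.",
}

def answer(question: str) -> str:
    """Return the closest FAQ answer, or route to a human as a fallback."""
    match = difflib.get_close_matches(question.lower().strip("?! "), FAQ, n=1, cutoff=0.5)
    if match:
        return FAQ[match[0]]
    return "I couldn't find that; a staff member will follow up during business hours."

print(answer("How do I register for classes?"))
```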
768
00:46:54,180 --> 00:46:56,490
KiKi L'Italien: I mean, the people
who are listening to this as an
769
00:46:56,490 --> 00:47:00,270
audio podcast later, they cannot
see me right now, but I'm just
770
00:47:00,270 --> 00:47:03,990
like furiously shaking my head,
my hands, anything that can move
771
00:47:04,440 --> 00:47:10,290
in the camera, just in furious
agreement. Because it's so true.
772
00:47:10,290 --> 00:47:13,620
Can you imagine not having all
of the bureaucracy that's
773
00:47:13,620 --> 00:47:17,550
pressing back and making things
so difficult? It's hard. It's
774
00:47:17,550 --> 00:47:20,940
hard for me to imagine, but
you're painting a picture that I
775
00:47:20,940 --> 00:47:26,100
really want to see painted. So
let's go there. And
776
00:47:26,100 --> 00:47:30,150
actually, to that end, my
last question for you, as I'm,
777
00:47:30,180 --> 00:47:34,470
as I'm wrapping things up, which
by the way, this has been a
778
00:47:34,470 --> 00:47:37,860
phenomenal conversation today.
So thank you so much ahead of
779
00:47:37,860 --> 00:47:43,020
time, but where do you see this
work that you're doing going?
780
00:47:43,020 --> 00:47:46,380
Now all of you know each other,
you're paying attention
781
00:47:46,380 --> 00:47:50,490
to what's happening with the
development of AI. And as it
782
00:47:50,490 --> 00:47:55,680
pertains to higher ed, what do
you see for the
783
00:47:55,680 --> 00:47:58,500
work that you're doing right now
to sort of help shape this
784
00:47:58,500 --> 00:48:02,100
conversation around AI and
higher education as you're
785
00:48:02,100 --> 00:48:07,230
moving forward? What is to come?
All right, and I see you,
786
00:48:07,230 --> 00:48:08,760
Audrey, go, I'm looking for
you.
787
00:48:09,030 --> 00:48:13,950
Audrey Ellis: Sure. You know,
there's an interesting, popular,
788
00:48:15,240 --> 00:48:17,640
I'm not sure, maybe they're an
association, actually; I'm
789
00:48:17,640 --> 00:48:21,750
thinking about a group in higher
education, EDUCAUSE, they
790
00:48:21,750 --> 00:48:25,980
released a report yesterday, an
AI landscape report, and I was
791
00:48:25,980 --> 00:48:29,130
spending some time with it over
the past two days, and something
792
00:48:29,130 --> 00:48:33,060
that I, you know, observed, and
being the data person that I am,
793
00:48:33,060 --> 00:48:35,850
who loves community colleges,
I'm always jumping to the end,
794
00:48:35,880 --> 00:48:38,910
or, if there's a survey,
seeing, okay, well, who are they
795
00:48:38,910 --> 00:48:42,480
talking about? And who was
included in the survey? And what
796
00:48:42,480 --> 00:48:46,800
groups do they represent? And so
this is really interesting
797
00:48:46,800 --> 00:48:50,370
report, lots of great insights
that I think can relate to lots
798
00:48:50,370 --> 00:48:54,030
of industries. A lot of things
we covered today: strategic
799
00:48:54,030 --> 00:48:57,870
planning for AI, preparing your
data for AI, thinking about ethical
800
00:48:57,870 --> 00:49:01,650
and equitable use cases. But
what I'm still seeing is, you
801
00:49:01,650 --> 00:49:05,670
know, 13% of the participants
were from the community college
802
00:49:05,670 --> 00:49:10,170
space, and about 21% of the
participants were from minority
803
00:49:10,170 --> 00:49:14,670
serving institutions, which does not
804
00:49:14,670 --> 00:49:18,720
represent parity, if you look at
the survey, you know, the survey
805
00:49:18,720 --> 00:49:23,520
population and our larger post
secondary space. And so what I
806
00:49:23,520 --> 00:49:27,270
hope will become an emerging
trend is that we can push the
807
00:49:27,270 --> 00:49:33,630
edges of our data collection and
our conversations, our sharing
808
00:49:33,660 --> 00:49:39,000
of knowledge and resources,
right, like funding and legal
809
00:49:39,000 --> 00:49:44,490
support and technical support to
these institutions who right now
810
00:49:44,490 --> 00:49:48,300
aren't even participating in
surveys, and therefore aren't
811
00:49:48,300 --> 00:49:51,270
even being considered part of
the landscape when they are
812
00:49:51,780 --> 00:49:55,980
because we're not making
truly representative statements
813
00:49:55,980 --> 00:49:59,220
like 'most colleges are doing
this' if we really aren't even
814
00:49:59,220 --> 00:50:02,850
including, you know, so many
community colleges that educate
815
00:50:02,880 --> 00:50:07,260
a vast majority of our students
in the country that are in the
816
00:50:07,260 --> 00:50:11,190
post secondary space, and
probably your future workers at,
817
00:50:11,880 --> 00:50:13,980
you know, all of the
associations and the industries
818
00:50:13,980 --> 00:50:16,830
you represent. So that's one
space where I'd really like us
819
00:50:16,830 --> 00:50:21,480
to see the conversation pushed
and, you know, supported
820
00:50:21,510 --> 00:50:22,110
further.
821
00:50:23,010 --> 00:50:24,090
KiKi L'Italien: Absolutely.
822
00:50:25,080 --> 00:50:27,870
Charles Ansell: Yeah. I
mean, I think what Audrey said,
823
00:50:27,870 --> 00:50:30,990
I mean, I think it's going to
be more a matter of seeing where the
824
00:50:30,990 --> 00:50:33,300
conversation's going. But, like,
more importantly, like moving
825
00:50:33,300 --> 00:50:36,450
the conversation, I think it's
going to be kind of pointless to
826
00:50:36,450 --> 00:50:39,000
speculate as to where it's gonna
go. You gave the example earlier
827
00:50:39,000 --> 00:50:41,340
of social media. And then we
referenced that somebody put in the
828
00:50:41,340 --> 00:50:43,710
chat that item about
calculators, like, we never
829
00:50:43,710 --> 00:50:46,230
know these things. Like when
email came out in the 90s, it
830
00:50:46,230 --> 00:50:48,690
wasn't like, you know, somebody
said what's going to happen, and then
831
00:50:48,690 --> 00:50:51,240
it was like, even remotely
correct, like 10 years later.
832
00:50:51,240 --> 00:50:53,880
And so I think that's what we're
seeing now. So I think like to
833
00:50:53,880 --> 00:50:56,670
Audrey's point about, like,
how are we going to act
834
00:50:56,670 --> 00:51:00,300
right now, I would consider two
or three things, right? Like one
835
00:51:00,300 --> 00:51:05,250
is that we've got to like stay
vigilantly ahead of where AI is
836
00:51:05,250 --> 00:51:09,120
going and not be, you know,
doing too much punditry or
837
00:51:09,120 --> 00:51:13,620
something. The second thing is
like we have to shift. I don't
838
00:51:13,620 --> 00:51:16,050
know the best way to put this
but like shift the moral urgency
839
00:51:16,380 --> 00:51:20,340
around things like plagiarism to
the problems that we need to
840
00:51:20,340 --> 00:51:24,060
solve. So instead of moral
panics for things, like, you
841
00:51:24,060 --> 00:51:26,730
know, oh my gosh, somebody's
gonna write their essay using
842
00:51:26,730 --> 00:51:29,430
ChatGPT, how about we have
the moral panic about a one in
843
00:51:29,430 --> 00:51:32,400
five graduation rate? How about
we have a moral panic about
844
00:51:32,400 --> 00:51:35,220
countries leapfrogging us in
higher education attainment, or
845
00:51:35,220 --> 00:51:38,250
having an innovation gap during
climate change and COVID and
846
00:51:38,250 --> 00:51:41,460
threats to democracy? We need to
be a beacon to the world again.
847
00:51:41,460 --> 00:51:44,100
And that's what Higher Ed was
supposed to be in the post war
848
00:51:44,100 --> 00:51:46,560
era. And that's how they talked
about it. And I think we need to
849
00:51:46,560 --> 00:51:50,190
have that orientation again, and
bring generative AI into that
850
00:51:50,190 --> 00:51:52,680
orientation. And then I guess
the last thing is, this is just
851
00:51:52,680 --> 00:51:56,190
building off of what President
Baston said, but like, I think
852
00:51:56,640 --> 00:51:59,760
the lowest-hanging fruit is that
bureaucracy, that bloat. You
853
00:51:59,760 --> 00:52:02,850
know, I think that that's even
true with just writing, right? I
854
00:52:02,850 --> 00:52:06,510
mean, you wouldn't trust AI so
much to write you a novel. But
855
00:52:06,540 --> 00:52:09,540
in terms of making some anodyne
policy that gets the job done,
856
00:52:09,780 --> 00:52:14,010
it can. We find all the ways
to make life hard for students.
857
00:52:14,010 --> 00:52:16,800
And I'm sure that's true in
health care for patients and
858
00:52:17,280 --> 00:52:20,160
just pick your sector, pick the
association here. And I think
859
00:52:20,160 --> 00:52:22,350
you start with the low hanging
fruit and exactly what President
860
00:52:22,350 --> 00:52:25,860
Baston said about killing the
bloat, killing the bureaucracy:
861
00:52:25,860 --> 00:52:28,440
this could really be a
bureaucracy killer in the best
862
00:52:28,440 --> 00:52:29,070
way possible.
863
00:52:29,880 --> 00:52:33,300
KiKi L'Italien: Oh, gosh, those are
the words that speak
864
00:52:33,300 --> 00:52:34,680
right to my heart.
865
00:52:37,590 --> 00:52:40,290
Michael Baston: I would just
add that we've got to be
866
00:52:40,290 --> 00:52:45,600
careful of false choices. Too
often we decide you can either
867
00:52:45,600 --> 00:52:49,860
be for it or against it. You
know, when the folks that had
868
00:52:49,860 --> 00:52:53,700
radios heard about TVs, I'm sure
they were like, yeah, that TV
869
00:52:53,700 --> 00:52:58,320
thing, it's probably not going
to take off. They got TVs, and
870
00:52:58,320 --> 00:53:03,210
they kept their radios too. So
we have to understand that you
871
00:53:03,210 --> 00:53:06,330
can actually keep two thoughts
in your mind at the same time,
872
00:53:06,450 --> 00:53:10,650
you actually can continue to
advance and be built on the
873
00:53:10,650 --> 00:53:14,400
foundations of really
understanding how the world
874
00:53:14,400 --> 00:53:18,090
works. It's not either or, and
we've got to be able to expand
875
00:53:18,090 --> 00:53:22,140
capacity, we've got to be able
to give people the resources are
876
00:53:22,140 --> 00:53:26,610
particularly under resourced
institutions, the opportunity to
877
00:53:26,640 --> 00:53:31,470
engage and to learn and to grow,
and to get with it, because if
878
00:53:31,470 --> 00:53:35,190
the country is going to move
forward, we really don't have
879
00:53:35,190 --> 00:53:37,230
any educational institution to
leave behind.
880
00:53:38,700 --> 00:53:42,510
KiKi L'Italien: Yes, I mean, oh,
my gosh, this has been
881
00:53:42,540 --> 00:53:50,940
such a great, great show. I want
to end on a high note, and I see
882
00:53:50,940 --> 00:53:54,660
this great question. Actually,
that comes from Peggy
883
00:53:54,660 --> 00:53:57,780
Lambertson. So if you have the
time, just answer one question,
884
00:53:57,780 --> 00:54:02,220
folks. This is for Audrey: do you
think the pool of knowledge
885
00:54:02,220 --> 00:54:06,330
feeding AI is at risk for cyber
attack? There might be pollution
886
00:54:06,330 --> 00:54:10,320
of online knowledge. Okay, so I
get this. This is about talking
887
00:54:10,320 --> 00:54:14,160
about, you know, where is the
data coming from? Can we trust
888
00:54:14,160 --> 00:54:19,110
the pool of knowledge
that generative AI is
889
00:54:19,140 --> 00:54:22,530
pulling from for the information
it's giving us?
890
00:54:24,150 --> 00:54:26,340
Audrey Ellis: Sure, I mean, to
just be fully
891
00:54:26,340 --> 00:54:29,520
transparent. I am no
cybersecurity expert. So take
892
00:54:29,520 --> 00:54:33,840
everything I say with a grain of
salt. But I think that in any
893
00:54:33,870 --> 00:54:40,140
space right now, where we have
documented information that is
894
00:54:40,140 --> 00:54:43,980
feeding into a tool, you said it
yourself very well, garbage in,
895
00:54:43,980 --> 00:54:48,000
garbage out. That's exactly how
I'm thinking about this. And so
896
00:54:48,180 --> 00:54:52,020
really important for us to be
thinking about the quality
897
00:54:52,290 --> 00:54:57,600
of any data that we're using to
train these systems, and asking
898
00:54:57,600 --> 00:55:01,650
those critical questions as
consumers: what data are
899
00:55:01,650 --> 00:55:06,510
you using to, you know, give me
this product or service? And
900
00:55:06,510 --> 00:55:10,140
also, what data are you
collecting from me? So really
901
00:55:10,140 --> 00:55:12,900
important question. And I think,
you know, we might get to a
902
00:55:12,900 --> 00:55:16,290
point, you might have noticed on
Google now, you search and
903
00:55:16,290 --> 00:55:21,360
there's like an AI-generated
summary before you even, you
904
00:55:21,360 --> 00:55:25,020
know, get to the results. And
for all we know, we're gonna
905
00:55:25,020 --> 00:55:28,380
end up at a time where AI is
training AI is training AI. And
906
00:55:28,380 --> 00:55:30,990
so like, I still think we're
really early in this space. And
907
00:55:30,990 --> 00:55:34,650
we need to think about, again,
how are we shaping it, and also
908
00:55:34,650 --> 00:55:38,730
whose voices, stories,
perspectives, and knowledge are even
909
00:55:38,970 --> 00:55:43,410
included, for better or for
worse, looking at the history of
910
00:55:43,410 --> 00:55:49,950
kind of many disparate use cases
of tech and equity. But there's
911
00:55:49,950 --> 00:55:54,450
a lot of information
and knowledge to come out here.
912
00:55:54,990 --> 00:55:58,470
And I will just add, on the equity
piece, there's a great book,
913
00:55:58,500 --> 00:56:01,980
Unmasking AI, if you're
interested in learning more
914
00:56:01,980 --> 00:56:09,120
about equity and AI, especially
in terms of biometric data. So
915
00:56:09,150 --> 00:56:12,420
like facial recognition, things
like that. Highly recommend it.
916
00:56:13,470 --> 00:56:15,510
Michael Baston: Once again,
it's also about due
917
00:56:15,510 --> 00:56:19,800
diligence. So if we tell our
students that AI is like your
918
00:56:19,950 --> 00:56:23,190
an intern or, you know, an
assistant, you get that you can't
919
00:56:23,190 --> 00:56:27,840
turn in to your boss work you
didn't check. So it is important
920
00:56:27,840 --> 00:56:32,220
for us to make sure that we
encourage everyone that is
921
00:56:32,220 --> 00:56:35,370
utilizing this information to do
the due diligence; you've got to
922
00:56:35,370 --> 00:56:35,760
check.
923
00:56:37,440 --> 00:56:40,110
KiKi L'Italien: Well, I just
can't thank you enough for
924
00:56:40,140 --> 00:56:44,520
taking the time to talk with us
today. Audrey, thank you. When I
925
00:56:44,550 --> 00:56:49,050
first spoke with
Audrey, she was able to bring
926
00:56:49,050 --> 00:56:53,580
Charles and Michael into
this discussion. So I'm very
927
00:56:53,580 --> 00:56:56,310
thankful for that. I think it
was a really meaningful
928
00:56:56,310 --> 00:57:00,360
conversation. And I hope that
you all will be open to maybe me
929
00:57:00,360 --> 00:57:03,120
reaching out for a follow up
conversation one of these days
930
00:57:03,120 --> 00:57:07,140
to find out as we go along.
What's the story now? What's
931
00:57:07,140 --> 00:57:08,310
happened since then?
932
00:57:08,850 --> 00:57:10,260
Charles Ansell: Absolutely.
We'll have a lot to show
933
00:57:10,260 --> 00:57:10,800
anytime.
934
00:57:12,330 --> 00:57:13,980
KiKi L'Italien: Thank you so
much, everyone. I really
935
00:57:13,980 --> 00:57:17,580
appreciate it. And thanks to all
of you for watching today. What
936
00:57:17,580 --> 00:57:21,540
did you think? Tell me in the
comments. Write back to me, tell
937
00:57:21,540 --> 00:57:24,720
me what you thought. What were
your takeaways? What were
938
00:57:24,720 --> 00:57:27,630
the biggest pieces that stood
out to you that made you say, ah, I
939
00:57:27,630 --> 00:57:30,090
never thought about it that way.
But now I'm going to do
940
00:57:30,090 --> 00:57:33,450
something different because of
this. Whether it's something
941
00:57:33,450 --> 00:57:36,570
that's going to help you to
explore further and ask more
942
00:57:36,570 --> 00:57:40,470
questions. I hope that you will
stay curious and keep asking
943
00:57:40,470 --> 00:57:44,580
questions every day, especially
when you're scared because as
944
00:57:44,580 --> 00:57:49,980
Joseph Campbell once said, The
cave you fear to enter holds the
945
00:57:49,980 --> 00:57:55,110
treasure you seek. Have a great
rest of the week, everyone.