L.A.B.S. #5

AI In Action: A New Era for Assessment and Business

As AI transforms assessments and business operations, use these insights to drive innovation and efficiency.

Ready to revolutionize your approach to using AI? In this dynamic discussion led by industry experts at the forefront of AI implementation, we explore AI's transformative power in assessment and business. 

 

Discover how AI can revolutionize your business and drive innovation in real-world scenarios. Whether you're a seasoned industry professional or new to the field, this webinar provides actionable insights into leveraging AI for maximum benefit in your testing program. Sandy Hogg, our Manager of Business Development, is joined by industry experts at the forefront of AI implementation, including:

 

- Pat Ward (President & CEO, ITS)

- Kara McWilliams (former VP of Product Innovation and Development, ETS)

- Kimberly Swygert (Director of Test Development Innovations, NBME)

- Andre Allen (Owner & Executive VP, FifthTheory, LLC)


Interested in partnering on a webinar? Share your ideas at webinars@testsys.com. 

1
00:00:07.885 --> 00:00:09.235
Hello there, and thank you so much

2
00:00:09.235 --> 00:00:10.515
for joining us for today's webinar.

3
00:00:11.255 --> 00:00:13.515
We are going to give everyone a few minutes to join,

4
00:00:13.535 --> 00:00:15.795
so we'll get started around 12:03.

5
00:00:16.575 --> 00:00:18.275
Uh, but in the meantime, I'm going

6
00:00:18.275 --> 00:00:20.315
to be launching a poll for you all to answer.

7
00:00:20.935 --> 00:00:22.435
So one second

8
00:00:24.185 --> 00:00:29.035
and I will get our

9
00:00:29.035 --> 00:00:30.195
poll up and running.

10
00:00:38.585 --> 00:00:40.675
Hold on. I may need to stop sharing in order to do that.

11
00:00:40.965 --> 00:00:44.305
Polls launch. Alright.

12
00:00:44.325 --> 00:00:46.385
So we are curious to know which application

13
00:00:46.385 --> 00:00:48.905
of AI do you believe holds the most promise

14
00:00:49.325 --> 00:00:50.385
for your organization.

15
00:00:50.605 --> 00:00:51.785
So as you're getting in

16
00:00:51.785 --> 00:00:54.345
and you're getting settled, please do, uh,

17
00:00:54.685 --> 00:00:56.985
select the best option that applies to you

18
00:00:57.865 --> 00:01:00.005
and it'll help, uh, give us a sense of what kind

19
00:01:00.005 --> 00:01:01.365
of audience we're working with today.

20
00:01:51.325 --> 00:01:53.585
And I see we have 78% participation

21
00:01:53.685 --> 00:01:55.065
so far, so that's fantastic.

22
00:01:55.285 --> 00:01:56.425
So as we're joining, we're gonna

23
00:01:56.425 --> 00:01:57.505
get started in just a minute.

24
00:01:58.005 --> 00:01:59.545
Uh, if you have a moment to answer our survey

25
00:01:59.825 --> 00:02:02.055
question, that'd be fantastic.

26
00:02:24.205 --> 00:02:27.015
Okay. We just jumped up to 86% participation

27
00:02:27.015 --> 00:02:28.175
and we're at 12:03.

28
00:02:28.395 --> 00:02:30.775
So I am going to end the poll

29
00:02:30.915 --> 00:02:32.575
and share the results with you all.

30
00:02:33.585 --> 00:02:35.015
There is a clear winner here.

31
00:02:35.835 --> 00:02:37.495
It seems that most of you feel

32
00:02:37.495 --> 00:02:39.535
that improving operational efficiency is going

33
00:02:39.535 --> 00:02:41.295
to hold the most promise for your organization.

34
00:02:41.555 --> 00:02:45.175
So we will see if our panelists today can either reinforce

35
00:02:45.205 --> 00:02:46.375
that belief for you

36
00:02:46.595 --> 00:02:49.735
or perhaps make a case, uh, case for changing your mind.

37
00:02:52.865 --> 00:02:56.615
All right, so I'll get that out of here

38
00:02:58.085 --> 00:02:59.465
and we can go ahead and get started.

39
00:03:01.545 --> 00:03:03.925
So, welcome everyone. My name is Sandy Hogg.

40
00:03:03.925 --> 00:03:06.565
I'm the Manager of Business Development at ITS,

41
00:03:07.105 --> 00:03:09.365
and I'll be your moderator during today's webinar.

42
00:03:10.225 --> 00:03:12.325
So once again, thank you so much for joining us

43
00:03:12.385 --> 00:03:14.165
and we really appreciate your company today.

44
00:03:15.385 --> 00:03:17.365
The session we'll be covering today is going

45
00:03:17.365 --> 00:03:19.405
to delve into AI in action.

46
00:03:19.865 --> 00:03:22.885
So we've assembled a diverse panel of experts that are eager

47
00:03:22.945 --> 00:03:26.205
to share how their organizations are already leveraging AI

48
00:03:26.265 --> 00:03:27.925
to transform the business of assessment

49
00:03:28.425 --> 00:03:30.965
and also share their unique perspectives on how

50
00:03:31.385 --> 00:03:34.445
to think about integrating AI into your, uh,

51
00:03:34.725 --> 00:03:35.845
business or program strategy.

52
00:03:36.585 --> 00:03:39.845
So this will be a panel style webinar, don't panic at the,

53
00:03:39.945 --> 00:03:41.565
at the slide that is showing right now.

54
00:03:41.945 --> 00:03:44.285
Uh, so for the majority of the time it'll just be me

55
00:03:44.285 --> 00:03:45.365
and the panelists talking.

56
00:03:46.065 --> 00:03:47.845
But before we dive in, I do want

57
00:03:47.845 --> 00:03:50.125
to go over a few housekeeping items.

58
00:03:54.445 --> 00:03:57.865
So we have reserved dedicated time at the end

59
00:03:58.065 --> 00:03:59.585
of the webinar for Q and A.

60
00:04:00.285 --> 00:04:02.785
You can submit questions using the Zoom Q

61
00:04:02.785 --> 00:04:04.385
and A button that you should see at the bottom

62
00:04:04.525 --> 00:04:07.145
or top of your screen depending on your Zoom layout.

63
00:04:07.965 --> 00:04:10.905
You are welcome to submit questions throughout the webinar

64
00:04:11.085 --> 00:04:12.265
as the questions come to you.

65
00:04:12.965 --> 00:04:15.225
And we, there is an upvoting feature

66
00:04:15.365 --> 00:04:16.785
as well with the Q and A.

67
00:04:16.845 --> 00:04:19.705
So I do highly encourage you as questions come in,

68
00:04:19.725 --> 00:04:22.705
if you also find it interesting and would also like to sort

69
00:04:22.705 --> 00:04:24.425
of vote for that question to surface

70
00:04:25.045 --> 00:04:26.385
to bubble to the top of our list.

71
00:04:26.485 --> 00:04:28.585
Uh, please do upvote questions as you see them

72
00:04:28.585 --> 00:04:31.065
because, uh, we will be using that at the end of the webinar

73
00:04:31.205 --> 00:04:34.745
to determine, uh, which questions to answer first.

74
00:04:35.885 --> 00:04:37.985
Uh, this webinar will be recorded,

75
00:04:38.445 --> 00:04:40.025
so the recording will be available.

76
00:04:40.645 --> 00:04:44.505
Um, the, the link to access that recording will be available

77
00:04:44.505 --> 00:04:45.705
after the webinar is over

78
00:04:45.765 --> 00:04:48.065
and it will also be available from our website.

79
00:04:49.045 --> 00:04:50.505
And last

80
00:04:50.505 --> 00:04:52.945
but not least, there will be a survey at the end

81
00:04:52.945 --> 00:04:55.065
of the webinar, kind of in your Zoom window.

82
00:04:55.165 --> 00:04:56.785
So don't click, click away too quickly.

83
00:04:57.285 --> 00:04:59.705
Uh, we do wanna hear what you think about today's session

84
00:05:00.045 --> 00:05:02.385
and get your ideas for future webinars as well.

85
00:05:03.085 --> 00:05:06.015
So that being said,

86
00:05:06.115 --> 00:05:08.295
I'm gonna stop sharing my screen and we're gonna get started.

87
00:05:08.515 --> 00:05:11.015
So I'm gonna ask today's panelists to turn on their cameras

88
00:05:11.355 --> 00:05:12.735
and join me.

89
00:05:17.275 --> 00:05:19.285
Perfect. Alright,

90
00:05:19.305 --> 00:05:21.605
so the very first thing I'm gonna do is we're gonna quickly

91
00:05:21.705 --> 00:05:24.525
ask today's panelists to go around the table

92
00:05:24.545 --> 00:05:27.645
and introduce themselves so that everyone knows who you are.

93
00:05:28.065 --> 00:05:29.485
Um, I'm gonna start with Kara first.

94
00:05:29.485 --> 00:05:31.805
Kara, could you introduce yourself? Sure.

95
00:05:32.145 --> 00:05:34.325
Hi Sandy. Thank you so much for having me here today.

96
00:05:34.465 --> 00:05:37.285
Hi everyone. Thanks for joining. My name's Kara McWilliams.

97
00:05:37.685 --> 00:05:39.765
I am the Vice President of Product Innovation

98
00:05:39.825 --> 00:05:41.565
and Development at ETS.

99
00:05:41.745 --> 00:05:44.645
My background is in psychometrics, cognitive

100
00:05:44.645 --> 00:05:45.965
science and ed tech.

101
00:05:46.105 --> 00:05:48.845
And my passion is really about using advanced technology

102
00:05:48.945 --> 00:05:51.085
to help folks, um, progress in life.

103
00:05:52.255 --> 00:05:56.875
Great, thank you. Andre. Sandy, thanks for

104
00:05:56.875 --> 00:05:57.875
having me. Uh, my name is

105
00:05:57.875 --> 00:05:58.455
Andre Allen.

106
00:05:58.715 --> 00:06:00.775
Uh, I'm with a company called Fifth Theory.

107
00:06:01.195 --> 00:06:03.735
I'm also a member of the Association of Test Publishers

108
00:06:04.195 --> 00:06:05.455
and I sit on the Board of Human

109
00:06:05.655 --> 00:06:07.015
Resources Certification Institute.

110
00:06:07.325 --> 00:06:09.615
Been in the assessment industry now for about 30 years.

111
00:06:10.705 --> 00:06:13.655
Great. Thank you so much for joining us. Andre. Uh, pat,

112
00:06:14.945 --> 00:06:15.945
Uh, hi everyone.

113
00:06:15.945 --> 00:06:18.685
I'm Pat Ward, I'm the president and CEO

114
00:06:18.705 --> 00:06:20.085
and I guess original founder

115
00:06:20.185 --> 00:06:22.365
of Internet Testing Systems or ITS.

116
00:06:22.785 --> 00:06:26.405
Um, my background is, uh, technology software development

117
00:06:26.625 --> 00:06:29.565
and um, um, I'm thrilled to be here with this group.

118
00:06:29.825 --> 00:06:32.645
Um, a panelists, uh, some, uh, uh, it's been,

119
00:06:32.645 --> 00:06:34.085
it's been fun actually doing our prep

120
00:06:34.085 --> 00:06:35.645
sessions, getting to know them a little bit.

121
00:06:35.745 --> 00:06:37.165
So it's gonna be a great session.

122
00:06:38.455 --> 00:06:40.885
Great. Thanks Pat. And Kimberly, last but not least.

123
00:06:41.265 --> 00:06:42.405
Yes, thank you so much.

124
00:06:42.625 --> 00:06:45.525
Um, I'm Kimberly Swygert, I'm a psychometrician

125
00:06:45.625 --> 00:06:48.165
and I'm also the Director of Test Development Innovations

126
00:06:48.165 --> 00:06:49.685
for the National Board of Medical Examiners.

127
00:06:50.315 --> 00:06:52.925
I've been there for about 22 years in the assessment field

128
00:06:52.945 --> 00:06:56.085
for about 24, or actually about 26 at this point.

129
00:06:56.785 --> 00:06:59.725
And so I lead the innovations team within test development

130
00:06:59.725 --> 00:07:03.525
where we are very excited about the use of new technology,

131
00:07:03.525 --> 00:07:06.445
including AI to support all sorts of, um,

132
00:07:06.495 --> 00:07:07.885
tasks within test development.

133
00:07:08.465 --> 00:07:11.645
And I've been interested in AI since about late 2021

134
00:07:12.275 --> 00:07:14.885
when we began having some internal discussions about the

135
00:07:14.885 --> 00:07:17.645
potential for AI to support work done at the National Board.

136
00:07:18.105 --> 00:07:19.105
So thank you.

137
00:07:20.305 --> 00:07:23.585
Excellent. Alright, so we're gonna kick things off

138
00:07:23.585 --> 00:07:25.585
with a round table, uh, question.

139
00:07:25.615 --> 00:07:27.145
Something I want each of you to answer,

140
00:07:27.245 --> 00:07:29.425
and I'd like to keep it pretty basic upfront.

141
00:07:30.325 --> 00:07:32.705
Uh, so what I'd like to know in about a minute,

142
00:07:33.005 --> 00:07:36.305
can you share with us what AI means to you?

143
00:07:36.765 --> 00:07:39.225
And I'll start back around in the same order with Kara.

144
00:07:40.315 --> 00:07:42.405
Sure. Um, actually Sandy, I think

145
00:07:42.405 --> 00:07:44.045
that's an incredibly complex question.

146
00:07:44.185 --> 00:07:45.365
Not really a basic question,

147
00:07:45.385 --> 00:07:46.805
but I'm glad we're kicking off with this one.

148
00:07:47.405 --> 00:07:51.045
I actually think I'd like to start with what AI isn't to me.

149
00:07:51.265 --> 00:07:53.685
And to me AI isn't a solution.

150
00:07:53.965 --> 00:07:56.485
I think a lot of people think that AI is a solution,

151
00:07:56.625 --> 00:07:59.285
and in fact, to me, AI is a tool.

152
00:07:59.715 --> 00:08:01.485
It's an incredibly powerful tool.

153
00:08:01.585 --> 00:08:03.605
Um, but it's another tool in our toolbox

154
00:08:03.665 --> 00:08:06.365
to help serve the users we're trying to support.

155
00:08:06.905 --> 00:08:10.685
And to me, it's really about bringing AI together with

156
00:08:10.685 --> 00:08:12.245
what we know about how people learn

157
00:08:12.265 --> 00:08:14.205
and demonstrate what they know most effectively

158
00:08:14.385 --> 00:08:16.245
and deliver it to people in meaningful

159
00:08:16.265 --> 00:08:17.445
and, um, delightful ways.

160
00:08:17.705 --> 00:08:20.205
So to me, AI is not a solution,

161
00:08:20.265 --> 00:08:22.765
but is an incredibly powerful tool we can use

162
00:08:22.765 --> 00:08:23.765
to help the learners we serve.

163
00:08:24.735 --> 00:08:26.195
That's great. I like how you use the

164
00:08:26.195 --> 00:08:27.235
word delightful there.

165
00:08:27.235 --> 00:08:30.835
Deliver it in delightful ways. Um, Andre, what's your take?

166
00:08:30.865 --> 00:08:32.755
What is, uh, what, what does AI mean to you?

167
00:08:33.425 --> 00:08:35.715
Well, Sandy, I'm gonna keep the positive vibes with Kara,

168
00:08:36.135 --> 00:08:37.555
uh, the way she described it

169
00:08:37.655 --> 00:08:40.675
and I am in violent agreement, uh, with what she stated.

170
00:08:41.065 --> 00:08:43.195
I'll just add three words. Um, opportunity,

171
00:08:43.455 --> 00:08:44.835
access and adventure.

172
00:08:45.415 --> 00:08:48.355
Uh, for those that are willing to ride the surfboard of AI,

173
00:08:48.735 --> 00:08:50.915
um, I think there's going to be incredible opportunity

174
00:08:51.015 --> 00:08:52.875
to do a lot of things, a lot of great things.

175
00:08:53.035 --> 00:08:56.325
I think it's gonna open up access, uh, for people to be able

176
00:08:56.325 --> 00:08:57.765
to demonstrate their skill and,

177
00:08:57.825 --> 00:08:59.765
and for us to be able to evaluate things

178
00:08:59.765 --> 00:09:01.365
that we weren't able to evaluate before.

179
00:09:01.425 --> 00:09:02.645
And I think it's gonna be an adventure

180
00:09:02.875 --> 00:09:05.045
because I think we are going to learn things

181
00:09:05.045 --> 00:09:06.845
that we didn't expect to learn.

182
00:09:07.485 --> 00:09:10.645
I think every uh, sort of innovative period in our time, uh,

183
00:09:10.675 --> 00:09:13.125
created some learnings that we just didn't expect to learn.

184
00:09:13.385 --> 00:09:16.645
You know, several years ago, um, you, you know, when I went

185
00:09:16.645 --> 00:09:18.645
to school, you had to be smart for real, meaning you had

186
00:09:18.645 --> 00:09:20.165
to memorize a bunch of things,

187
00:09:20.615 --> 00:09:21.615
Right?

188
00:09:21.825 --> 00:09:24.245
And now it's more about do you have the ability

189
00:09:24.305 --> 00:09:26.925
to write the ask, to ask the right questions at the right

190
00:09:26.925 --> 00:09:28.445
time and search for the right information?

191
00:09:28.465 --> 00:09:30.125
That's what defines intelligence today.

192
00:09:30.185 --> 00:09:32.125
So I think it'll create that same sort

193
00:09:32.125 --> 00:09:33.765
of change in how we view things.

194
00:09:34.675 --> 00:09:36.465
Great. How about you Pat?

195
00:09:36.465 --> 00:09:37.785
Are you on the positive side of this?

196
00:09:38.645 --> 00:09:42.305
Oh yeah, I am. Uh, you know, it, it's nice following Andre

197
00:09:42.305 --> 00:09:44.025
and Kimberly 'cause they're so eloquent in,

198
00:09:44.025 --> 00:09:45.505
in the, in how they phrased.

199
00:09:45.505 --> 00:09:46.625
And I agree with everything they said.

200
00:09:46.785 --> 00:09:49.905
I, I would start with just like, what a cool technology.

201
00:09:50.385 --> 00:09:52.745
I mean, you know, just from a technology standpoint,

202
00:09:53.545 --> 00:09:54.585
watching what it does.

203
00:09:54.645 --> 00:09:59.385
And I remember back in, you know, that December, 2022 right?

204
00:09:59.415 --> 00:10:01.505
When ChatGPT came out and

205
00:10:01.805 --> 00:10:03.185
You know, what's interesting?

206
00:10:03.445 --> 00:10:06.825
As cool as it seemed right then it's even more cool now.

207
00:10:06.945 --> 00:10:09.585
I mean, it's it's one of those hypes that, you know,

208
00:10:09.845 --> 00:10:12.585
it actually lived up more and it, it does even more.

209
00:10:12.765 --> 00:10:14.865
Um, and I'm, I'm, I love when watching it.

210
00:10:15.065 --> 00:10:17.665
I, I love all the things that are possible.

211
00:10:17.665 --> 00:10:19.665
There are things I'm scared about, like cheating.

212
00:10:19.675 --> 00:10:21.425
We'll talk a little bit about more of that later.

213
00:10:21.965 --> 00:10:24.545
Um, but it, it's just gonna get better, right?

214
00:10:24.845 --> 00:10:26.265
And, um, it's just fun.

215
00:10:26.375 --> 00:10:28.105
It's so much fun to be a part of this.

216
00:10:28.605 --> 00:10:32.145
Um, I, you know, just in my, you know, my lifetime,

217
00:10:32.245 --> 00:10:34.585
you know, actually I can go back to the microcomputer

218
00:10:34.845 --> 00:10:36.105
and I remember when that came out.

219
00:10:36.105 --> 00:10:37.985
And then the internet, you know, and mobile phones

220
00:10:38.045 --> 00:10:40.705
and this is like the latest really great thing.

221
00:10:40.705 --> 00:10:43.345
And it, it is, it is a lot of fun. It. Great.

222
00:10:44.465 --> 00:10:47.685
And Kimberly, Yeah, uh, on a related note,

223
00:10:47.835 --> 00:10:48.885
what I think it means is

224
00:10:48.885 --> 00:10:52.645
that our work in psychometrics is not like taking place in a

225
00:10:52.645 --> 00:10:54.325
bubble or behind walls any longer.

226
00:10:54.425 --> 00:10:55.725
So as someone who's been kind

227
00:10:55.725 --> 00:10:57.885
of very deep within testing organizations

228
00:10:57.885 --> 00:10:59.885
for their whole career, um, always

229
00:10:59.885 --> 00:11:01.685
before, even if we came up with something new

230
00:11:01.685 --> 00:11:04.725
that we would use for an assessment, um, we would kind

231
00:11:04.725 --> 00:11:07.365
of just be doing that internally until it was time to reveal

232
00:11:07.365 --> 00:11:09.285
that to our stakeholders, our examinees.

233
00:11:09.385 --> 00:11:11.525
And we didn't necessarily have to have

234
00:11:11.525 --> 00:11:13.245
that wide a conversation about it.

235
00:11:13.265 --> 00:11:16.285
And now the conversation is literally as wide as it can get

236
00:11:16.395 --> 00:11:19.085
because of things like what Pat just mentioned with

237
00:11:19.765 --> 00:11:21.325
seemingly everyone on earth finding out about

238
00:11:21.325 --> 00:11:22.605
ChatGPT at the same time.

239
00:11:23.265 --> 00:11:28.005
Um, I was in the office in January of 2023 and for one day,

240
00:11:28.265 --> 00:11:30.605
and I heard ChatGPT mentioned in every

241
00:11:30.605 --> 00:11:31.685
single meeting that day.

242
00:11:31.825 --> 00:11:33.165
And then when I left the office

243
00:11:33.305 --> 00:11:35.165
and was waiting for friends in the lobby,

244
00:11:35.705 --> 00:11:38.245
the security guard had a local radio station on

245
00:11:38.265 --> 00:11:41.725
and the DJ started talking about ChatGPT

246
00:11:42.225 --> 00:11:44.685
and how it might help students like cheat on their

247
00:11:44.685 --> 00:11:45.765
school assignments or whatever.

248
00:11:45.865 --> 00:11:49.125
And I was like, this is literally everywhere at once.

249
00:11:49.585 --> 00:11:51.525
And that's not necessarily

250
00:11:51.625 --> 00:11:53.085
how our discussions have taken place

251
00:11:53.320 --> 00:11:54.405
before in terms of

252
00:11:54.405 --> 00:11:55.765
how we're thinking about incorporating

253
00:11:56.015 --> 00:11:57.445
technology into the exam.

254
00:11:57.545 --> 00:12:00.285
So it's really interesting that it's gone from a

255
00:12:00.285 --> 00:12:01.405
what are we gonna do going forward

256
00:12:01.625 --> 00:12:04.245
to now every stakeholder may be saying,

257
00:12:04.465 --> 00:12:06.245
how are you gonna use AI going forward?

258
00:12:06.245 --> 00:12:07.245
Mm-Hmm. And what you create.

259
00:12:08.195 --> 00:12:09.835
Hmm. Definitely.

260
00:12:11.475 --> 00:12:12.885
Alright, well I'm gonna keep it going

261
00:12:12.885 --> 00:12:14.645
with another round table because you all have a lot

262
00:12:14.645 --> 00:12:16.205
of really interesting things to say here.

263
00:12:16.625 --> 00:12:19.325
So, you know, I think we're all really curious to know

264
00:12:19.555 --> 00:12:21.205
what you all are seeing in each

265
00:12:21.205 --> 00:12:23.165
of your individual organizations today

266
00:12:23.795 --> 00:12:25.965
regarding the transformative power of AI.

267
00:12:26.145 --> 00:12:28.285
Um, and you know, for example,

268
00:12:28.285 --> 00:12:30.845
we hear a lot about automated item generation right now,

269
00:12:30.985 --> 00:12:34.205
so I am curious about some of the other ways outside

270
00:12:34.205 --> 00:12:35.485
of just strictly that, that you,

271
00:12:35.795 --> 00:12:37.765
that your organizations are using AI

272
00:12:37.905 --> 00:12:41.205
or looking to use AI as an efficiency driver,

273
00:12:41.845 --> 00:12:43.805
creative catalyst, analytical tool.

274
00:12:44.425 --> 00:12:46.865
Um, so yeah, I'd love to hear it.

275
00:12:46.885 --> 00:12:48.305
I'm gonna start with Pat on this one.

276
00:12:49.085 --> 00:12:50.785
You know, I meant to mention last time

277
00:12:50.805 --> 00:12:52.705
and I thought the survey was impossible

278
00:12:52.705 --> 00:12:55.465
because how do I, how do I choose from,

279
00:12:55.855 --> 00:12:58.265
from those answers on, uh, you know, and,

280
00:12:58.285 --> 00:12:59.945
and I could have put something in other too.

281
00:13:00.165 --> 00:13:02.665
Um, Mm-Hmm. You know, when it, when it, you know, going back

282
00:13:02.665 --> 00:13:04.105
to that, that moment in December,

283
00:13:04.205 --> 00:13:05.785
and you know, I, I knew that,

284
00:13:05.885 --> 00:13:06.945
you know, the world had changed.

285
00:13:07.405 --> 00:13:10.985
Um, you know, I was encouraging everybody in our company

286
00:13:11.085 --> 00:13:14.425
to start using it, you know, for things in the company just

287
00:13:14.425 --> 00:13:16.185
so we would get experience with it, right?

288
00:13:16.685 --> 00:13:18.945
Mm-Hmm. But, but we also put in place, um,

289
00:13:19.015 --> 00:13:21.705
some initiatives from a development perspective right away.

290
00:13:22.125 --> 00:13:24.585
And we did, um, we, we've actually branded it,

291
00:13:24.585 --> 00:13:25.865
it's called SparkAI,

292
00:13:25.965 --> 00:13:28.865
and it's meant to basically be more than just generating

293
00:13:28.935 --> 00:13:30.025
item content, right?

294
00:13:30.405 --> 00:13:32.785
But with that said, our very first release was

295
00:13:32.785 --> 00:13:34.745
for our item bank, and we actually released, you know,

296
00:13:34.745 --> 00:13:35.985
generating items, right?

297
00:13:35.985 --> 00:13:38.625
That was, that was actually where we, where we started.

298
00:13:39.085 --> 00:13:43.665
Um, you know, we also did a, a tool, which we call, um, uh,

299
00:13:44.135 --> 00:13:47.945
it's, it's a SparkAI playground, which is, it's related

300
00:13:47.945 --> 00:13:49.385
to item content generation,

301
00:13:49.405 --> 00:13:51.585
but we, one of the things we did was we, we hooked up

302
00:13:51.585 --> 00:13:53.945
of all the different vendors, not just, uh, you know, Open,

303
00:13:54.015 --> 00:13:57.225
OpenAI, but, you know, all like the Llama one

304
00:13:57.245 --> 00:13:59.905
and you know, the, uh, the stuff coming outta Microsoft

305
00:13:59.905 --> 00:14:01.305
and AWS and things like that.

306
00:14:01.885 --> 00:14:04.745
And we created a playground, um, that

307
00:14:04.785 --> 00:14:07.065
where they could basically put in a single prompt

308
00:14:07.165 --> 00:14:09.185
and then see how a whole bunch

309
00:14:09.185 --> 00:14:11.185
of different models would generate it simultaneously.

310
00:14:11.405 --> 00:14:13.585
So you could see how it works, just so we could kind

311
00:14:13.585 --> 00:14:14.945
of help learn, you know, what was,

312
00:14:15.015 --> 00:14:16.145
what, how things work, Mm-Hmm.

313
00:14:16.265 --> 00:14:18.625
So we, we've made that available to our customers as well.
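
What Pat describes is essentially a fan-out pattern: one prompt sent to several model back ends, with the outputs collected side by side for comparison. A minimal Python sketch of that pattern follows; the adapter functions, model names, and timeout are placeholders for illustration, not the actual SparkAI playground implementation.

```python
# Minimal sketch: fan one prompt out to several LLM back ends and collect
# the responses side by side. The adapters below are placeholders -- wire
# them to whichever SDKs (OpenAI, Bedrock, Azure, a local Llama server, etc.)
# your organization actually uses.
from concurrent.futures import ThreadPoolExecutor


def call_model_a(prompt: str) -> str:
    # Placeholder adapter; replace with a real SDK call.
    raise NotImplementedError


def call_model_b(prompt: str) -> str:
    # Placeholder adapter; replace with a real SDK call.
    raise NotImplementedError


ADAPTERS = {
    "model_a": call_model_a,
    "model_b": call_model_b,
}


def compare_models(prompt: str) -> dict[str, str]:
    """Send the same prompt to every configured model in parallel."""
    results: dict[str, str] = {}
    with ThreadPoolExecutor(max_workers=len(ADAPTERS)) as pool:
        futures = {name: pool.submit(fn, prompt) for name, fn in ADAPTERS.items()}
        for name, future in futures.items():
            try:
                results[name] = future.result(timeout=60)
            except Exception as exc:
                # Surface per-model failures without stopping the whole comparison.
                results[name] = f"ERROR: {exc}"
    return results


if __name__ == "__main__":
    draft_prompt = "Write one multiple-choice item assessing knowledge of topic X."
    for model, output in compare_models(draft_prompt).items():
        print(f"--- {model} ---\n{output}\n")
```

Running the identical prompt through every configured back end keeps the comparison fair, and a failure in one provider does not block the others.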

314
00:14:19.045 --> 00:14:21.785
Um, on the security front, we did some stuff with, um,

315
00:14:22.005 --> 00:14:24.545
you know, the, uh, what we call proctor assist,

316
00:14:24.545 --> 00:14:25.705
but where we're watching

317
00:14:25.705 --> 00:14:27.905
what candidates are doing in a remote proctor setting

318
00:14:28.365 --> 00:14:29.985
and alerting them if they pick up a phone

319
00:14:29.985 --> 00:14:31.825
or somebody starts talking or things like that.

320
00:14:32.285 --> 00:14:33.865
But what I'm really excited about

321
00:14:34.085 --> 00:14:36.705
and where we're working on is, uh, where we're starting

322
00:14:36.705 --> 00:14:38.425
to bring an analysis to the whole thing.

323
00:14:38.725 --> 00:14:42.065
And, um, that's, that's everything from, well,

324
00:14:42.065 --> 00:14:44.305
the starting point is like taking survey responses,

325
00:14:44.305 --> 00:14:45.345
open-ended questions.

326
00:14:45.525 --> 00:14:47.865
That's the one that's gonna come out, uh, very soon

327
00:14:48.035 --> 00:14:49.945
where you can kind of get an analysis, a summary

328
00:14:50.005 --> 00:14:51.365
of all the, the responses.
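
As a rough illustration of that kind of open-ended response summarization, here is a minimal sketch that batches free-text answers into one request to a hosted model and asks for themes. The OpenAI client is used only as an example back end; the model name and prompt wording are assumptions, not the SparkAI feature being described.

```python
# Minimal sketch of summarizing open-ended survey responses with a hosted LLM.
# The model choice and prompt are placeholders for illustration only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_responses(responses: list[str], question: str) -> str:
    """Ask the model for the main themes across a batch of free-text answers."""
    joined = "\n".join(f"- {r}" for r in responses)
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system",
             "content": "You summarize open-ended survey responses into key themes."},
            {"role": "user",
             "content": f"Question: {question}\nResponses:\n{joined}\n"
                        "Summarize the main themes and note any outliers."},
        ],
    )
    return completion.choices[0].message.content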

329
00:14:51.425 --> 00:14:53.045
But it might be looking at an item

330
00:14:53.145 --> 00:14:55.685
that's discovering bias in the item, or, or, mm-Hmm.

331
00:14:55.765 --> 00:14:57.565
Is the item written properly? Mm.

332
00:14:57.625 --> 00:15:00.605
Uh, we're, we're using it to kind of improve the, you know,

333
00:15:00.665 --> 00:15:04.245
the traditional, um, natural language processing

334
00:15:04.245 --> 00:15:05.445
for item enemy detection.

335
00:15:05.705 --> 00:15:07.165
But that's actually how I met Kimberly.

336
00:15:07.285 --> 00:15:08.565
'cause I went to her as an expert on that

337
00:15:08.585 --> 00:15:09.685
and asked her questions.

338
00:15:10.025 --> 00:15:12.045
Mm-Hmm. And she is, by the way, an expert

339
00:15:12.065 --> 00:15:13.605
and she gave me all sorts of good advice.

340
00:15:13.985 --> 00:15:15.605
And now we're bringing an AI to that,

341
00:15:15.665 --> 00:15:16.845
and it's actually getting better.

342
00:15:17.065 --> 00:15:20.005
So, um, you know, it's gonna, it's not going to stop.

343
00:15:20.235 --> 00:15:22.445
It's, it's going to be an ongoing thing for years,

344
00:15:22.465 --> 00:15:24.885
but I, it's the analysis side that I'm most excited about

345
00:15:25.225 --> 00:15:27.125
and we'll have the most projects in the long run.

346
00:15:28.085 --> 00:15:31.985
Great. So Kimberly, nice segue to you then.

347
00:15:32.245 --> 00:15:34.585
Yes. And thank you so much, Pat. I appreciate it.

348
00:15:35.045 --> 00:15:38.385
Um, so if anyone is just starting out with AI,

349
00:15:38.565 --> 00:15:40.745
I'm gonna recommend a book called The Business Case

350
00:15:40.765 --> 00:15:41.905
for AI by Dr.

351
00:15:42.225 --> 00:15:44.705
Sison. And I am a little biased towards it, but that's

352
00:15:44.705 --> 00:15:47.665
because I read it after we had started down our AI path.

353
00:15:48.245 --> 00:15:50.385
And the way that it says you should do AI is

354
00:15:50.385 --> 00:15:51.505
the way that we actually did AI.

355
00:15:51.725 --> 00:15:53.825
So again, uh, that was nice to hear.

356
00:15:54.445 --> 00:15:56.545
But there's three points I would make related

357
00:15:56.545 --> 00:15:58.665
to advice from the book, and they overlap a little bit with

358
00:15:58.665 --> 00:16:00.345
what Pat has just mentioned in terms of

359
00:16:00.345 --> 00:16:02.545
how you're gonna think about ways to use AI

360
00:16:02.545 --> 00:16:04.585
that might not just be item generation.

361
00:16:05.245 --> 00:16:07.305
So the first point is to start small,

362
00:16:07.365 --> 00:16:09.945
and the second is to identify the pain points

363
00:16:10.285 --> 00:16:12.025
and then look for the AI solutions.

364
00:16:12.725 --> 00:16:14.385
And that might sound a little bit obvious,

365
00:16:14.485 --> 00:16:15.945
but going back to our point about

366
00:16:15.945 --> 00:16:18.505
how everyone in the world found out about generative AI,

367
00:16:18.505 --> 00:16:19.825
at the same time, um,

368
00:16:19.825 --> 00:16:21.265
if you don't already have a plan in place

369
00:16:21.285 --> 00:16:23.665
for using AI in any of these capacities

370
00:16:23.845 --> 00:16:26.425
or an idea of where to start, it's really easy

371
00:16:26.425 --> 00:16:28.345
to get overwhelmed if stakeholders come to you

372
00:16:28.525 --> 00:16:31.465
and say, we need to start using AI, maybe for the sake

373
00:16:31.465 --> 00:16:33.705
of using AI, or so that everybody doesn't think we're

374
00:16:33.705 --> 00:16:34.785
behind and not using AI.

375
00:16:34.885 --> 00:16:37.105
So that's a, that's a dangerous trap to fall into.

376
00:16:37.805 --> 00:16:41.025
But in particular for the adding efficiencies part, that is

377
00:16:41.025 --> 00:16:44.465
where we have had great success, starting very small,

378
00:16:44.855 --> 00:16:46.865
leaning heavily on internal data mining

379
00:16:47.005 --> 00:16:48.545
for specific types of tasks.

380
00:16:48.725 --> 00:16:52.025
And in our case, that meant automating text matching

381
00:16:52.085 --> 00:16:54.785
and automating, um, item enemy coding.

382
00:16:55.045 --> 00:16:57.305
As Pat mentioned, they've also developed something for,

383
00:16:57.885 --> 00:16:58.985
and that's where we found that

384
00:16:58.985 --> 00:17:01.025
because we have large item banks, um,

385
00:17:01.135 --> 00:17:02.985
they're being worked on by many authors,

386
00:17:03.095 --> 00:17:05.865
some are on highly specific medical topics

387
00:17:05.935 --> 00:17:08.585
with very specialized and standardized language.

388
00:17:09.165 --> 00:17:11.985
Um, you know, automated enemy flagging was a success

389
00:17:11.985 --> 00:17:13.265
for us using NLP

390
00:17:13.535 --> 00:17:15.745
because it was just one limited application.
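
As a concrete illustration of that kind of limited, well-scoped application, here is a minimal sketch of text-similarity enemy flagging using TF-IDF and cosine similarity from scikit-learn. The threshold and sample items are made up, and this stands in for, rather than reproduces, the production approach.

```python
# Minimal sketch of NLP-based enemy-item flagging: score every pair of items
# for textual overlap and surface the most similar pairs for human review.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def flag_potential_enemies(item_texts, threshold=0.6):
    """Return (index_a, index_b, similarity) for item pairs above the threshold."""
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(item_texts)
    sims = cosine_similarity(tfidf)
    flagged = []
    n = len(item_texts)
    for i in range(n):
        for j in range(i + 1, n):
            if sims[i, j] >= threshold:
                flagged.append((i, j, float(sims[i, j])))
    return sorted(flagged, key=lambda pair: pair[2], reverse=True)


if __name__ == "__main__":
    bank = [
        "A 45-year-old presents with chest pain radiating to the left arm.",
        "A 45-year-old man presents with crushing chest pain and arm numbness.",
        "A 6-year-old presents with a barking cough and stridor.",
    ]
    for i, j, score in flag_potential_enemies(bank, threshold=0.3):
        print(f"Items {i} and {j} look related (cosine similarity {score:.2f})")
```

Flagged pairs go to human reviewers rather than being coded automatically, which keeps the tool in the role of a screen rather than a decision-maker.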

391
00:17:16.125 --> 00:17:18.065
And the key was that we took away a task

392
00:17:18.255 --> 00:17:19.745
that the humans hated doing.

393
00:17:20.465 --> 00:17:21.945
So that's where you want to look

394
00:17:21.945 --> 00:17:24.625
for is not only is the pain point something you want

395
00:17:24.625 --> 00:17:28.025
to improve, but you'll have much greater success

396
00:17:28.025 --> 00:17:30.265
with change management if you find a point where this,

397
00:17:30.275 --> 00:17:31.825
where the humans will say, yes,

398
00:17:31.825 --> 00:17:33.585
please take this, this task away from us.

399
00:17:33.585 --> 00:17:35.585
We'd like this to be automated. Right.

400
00:17:35.615 --> 00:17:38.195
Um, and then also related to a point that,

401
00:17:38.195 --> 00:17:40.355
that Pat made about potential language

402
00:17:40.415 --> 00:17:44.995
or bias, um, we had an intern in 2021 named Swati Padhee,

403
00:17:45.135 --> 00:17:48.195
who was an amazing person getting a PhD in computer

404
00:17:48.195 --> 00:17:49.205
science and AI.

405
00:17:49.745 --> 00:17:52.125
And she ran some fairly sophisticated NLP

406
00:17:52.125 --> 00:17:55.085
and machine learning techniques on our item banks

407
00:17:55.115 --> 00:17:57.165
that allowed us to identify unexpected

408
00:17:57.225 --> 00:18:00.285
or unintentional associations among the item language.

409
00:18:00.865 --> 00:18:03.325
And so that was published as part of conference proceedings,

410
00:18:03.745 --> 00:18:05.925
but we realized that that point, we had a tool

411
00:18:05.925 --> 00:18:09.045
that if we really wanted to open up some interesting

412
00:18:09.805 --> 00:18:13.285
questions related to, you know, unintentional, um,

413
00:18:13.745 --> 00:18:15.485
and undesirable language biases

414
00:18:16.025 --> 00:18:18.005
or support diversity initiatives,

415
00:18:18.025 --> 00:18:19.765
it could be extremely powerful for that.

416
00:18:19.825 --> 00:18:22.405
And in that case, it was doing something I don't even think

417
00:18:22.405 --> 00:18:25.125
you could have a human do if you didn't have the

418
00:18:25.305 --> 00:18:26.325
AI to support it.
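
The underlying idea, mining an existing bank for associations no author intended, can be illustrated even with simple co-occurrence statistics. The sketch below computes pointwise mutual information between a hypothetical watch list of terms and the rest of the item vocabulary; it is far simpler than the NLP and machine-learning techniques described in the published work, and the function and parameter names are illustrative.

```python
# Minimal sketch of surfacing unexpected term associations across an item bank:
# compute pointwise mutual information (PMI) between a watch list of terms
# (e.g., demographic descriptors) and the rest of the vocabulary, then review
# the strongest pairings.
import math
from collections import Counter
from itertools import combinations


def term_associations(item_texts, watch_terms, min_count=2):
    """Return ((term_a, term_b), pmi, joint_count), strongest associations first."""
    doc_freq = Counter()
    pair_freq = Counter()
    n_docs = len(item_texts)
    for text in item_texts:
        tokens = set(text.lower().split())
        for tok in tokens:
            doc_freq[tok] += 1
        for a, b in combinations(sorted(tokens), 2):
            pair_freq[(a, b)] += 1
    results = []
    for (a, b), joint in pair_freq.items():
        if joint < min_count:
            continue
        if a not in watch_terms and b not in watch_terms:
            continue
        pmi = math.log((joint / n_docs) /
                       ((doc_freq[a] / n_docs) * (doc_freq[b] / n_docs)))
        results.append(((a, b), pmi, joint))
    return sorted(results, key=lambda r: r[1], reverse=True)
```

Pairings that link a watch-list term to an unexpected clinical or behavioral term would then go to human reviewers, which mirrors the human-in-the-loop theme the panel returns to later.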

419
00:18:26.905 --> 00:18:30.405
So, um, even if you're not using AI for content generation,

420
00:18:31.045 --> 00:18:33.125
I think you can find a lot of interesting applications

421
00:18:33.225 --> 00:18:35.165
to apply to the items that you already have.

422
00:18:36.625 --> 00:18:40.185
That's great. And, you know, curious if that runs into,

423
00:18:40.185 --> 00:18:42.545
or if that sort of circumvents the concerns with copyright

424
00:18:42.545 --> 00:18:44.105
because you're just using it as an internal

425
00:18:44.175 --> 00:18:45.705
tool, right? Yeah,

426
00:18:45.935 --> 00:18:48.015
Exactly Right.

427
00:18:48.865 --> 00:18:50.335
Andre, curious for your take on this.

428
00:18:50.995 --> 00:18:54.095
You know, um, we, you know, at Fifth Theory, um, I have

429
00:18:54.095 --> 00:18:57.215
to credit a lot of our advancements in how we utilize AI

430
00:18:57.215 --> 00:18:59.615
to Dr. Jack Jones, our chief scientist, um,

431
00:18:59.615 --> 00:19:03.255
because it was him who really championed the use of it.

432
00:19:03.255 --> 00:19:05.255
And we use it more in terms of efficiency

433
00:19:05.315 --> 00:19:09.775
and creativity, um, sort of being that, um, seed catalyst

434
00:19:09.795 --> 00:19:12.055
to sort of get the creative, the creativity going.

435
00:19:12.595 --> 00:19:16.095
Um, but what we found is just using it as a tool internally.

436
00:19:16.455 --> 00:19:18.735
'cause we haven't extended it to our customers yet. Mm-Hmm.

437
00:19:18.815 --> 00:19:20.015
Uh, it's increased our efficiency.

438
00:19:20.635 --> 00:19:24.615
Um, and what's great about it is your ability to, you know,

439
00:19:24.615 --> 00:19:27.935
you already have a curated set of information about things

440
00:19:28.535 --> 00:19:30.375
adjacent to what you're actually looking for.

441
00:19:30.715 --> 00:19:34.815
So you're able to use outside resources, outside, um, bits

442
00:19:34.815 --> 00:19:38.695
of data as part of formulating, uh, either content creation.

443
00:19:38.695 --> 00:19:41.295
We've created courses with it, we've used it

444
00:19:41.295 --> 00:19:44.375
to generate sort of the starts of items, you know,

445
00:19:44.375 --> 00:19:45.775
that we may include in assessments.

446
00:19:46.155 --> 00:19:49.535
Um, so it's really been a great efficiency driver for us,

447
00:19:49.595 --> 00:19:51.375
and we're very, very bullish about it

448
00:19:51.375 --> 00:19:53.535
because we're looking for Okay, how can this be applied?

449
00:19:53.715 --> 00:19:56.095
Kim, to your point, um, the one thing

450
00:19:56.095 --> 00:19:58.455
that we've seen over the years is that when you start

451
00:19:58.805 --> 00:20:00.975
with a solution that's looking for a problem,

452
00:20:01.565 --> 00:20:03.135
then things never really get done.

453
00:20:03.325 --> 00:20:06.175
Yeah. Um, but if you're able to take something

454
00:20:06.175 --> 00:20:08.855
and apply it to a migraine problem, uh,

455
00:20:08.915 --> 00:20:11.775
the adoption really becomes high Mm-Hmm.

456
00:20:12.215 --> 00:20:14.175
Because, uh, people get immediate relief.

457
00:20:14.555 --> 00:20:17.415
And so the creation, we're a small company, so the ability

458
00:20:17.415 --> 00:20:20.775
to create content fast, have it be relatively accurate,

459
00:20:21.235 --> 00:20:23.495
and then give the humans that are in the loop something

460
00:20:23.495 --> 00:20:25.655
to really work with that's already tangible.

461
00:20:25.655 --> 00:20:26.815
Mm-Hmm. Just sped us up.

462
00:20:27.155 --> 00:20:29.135
Yep. Totally agree. Great.

463
00:20:30.095 --> 00:20:31.065
Glad to hear that. That's working

464
00:20:31.065 --> 00:20:32.025
really well for you, Andre.

465
00:20:32.215 --> 00:20:35.755
Yeah. Alright, Kara, take us home. What's, uh, yeah,

466
00:20:35.835 --> 00:20:38.595
I mean, I agree with everything that, um,

467
00:20:38.615 --> 00:20:39.675
my colleagues here said.

468
00:20:39.875 --> 00:20:41.355
I think that one thing that, um,

469
00:20:41.565 --> 00:20:43.595
pains me a little bit is when I hear everyone say,

470
00:20:43.795 --> 00:20:44.755
everyone heard about, you know,

471
00:20:44.755 --> 00:20:46.275
generative AI in the same day.

472
00:20:46.335 --> 00:20:48.605
And I'm fortunate that I, I founded Mm-Hmm.

473
00:20:48.605 --> 00:20:52.485
The ETS AI Labs in, um, you know, 2020, 4 years ago. Yeah.

474
00:20:52.485 --> 00:20:54.645
And I was fortunate to come to an organization

475
00:20:54.645 --> 00:20:56.805
that had been leveraging generative AI Mm-Hmm.

476
00:20:56.905 --> 00:20:58.965
For many years and AI scoring for decades.

477
00:20:59.465 --> 00:21:00.965
So we had a wonderful, um, you know,

478
00:21:00.965 --> 00:21:02.365
foundation upon which to build.

479
00:21:03.065 --> 00:21:04.965
Uh, as you know, my colleagues here,

480
00:21:04.965 --> 00:21:07.165
and especially Pat, has said things have really,

481
00:21:07.185 --> 00:21:08.285
really accelerated.

482
00:21:08.305 --> 00:21:10.045
Yes. Um, in the four years since we,

483
00:21:10.055 --> 00:21:11.525
since we started the labs here.

484
00:21:12.025 --> 00:21:15.125
And, you know, we're able to see moving from automated item

485
00:21:15.125 --> 00:21:18.245
generation to truly automated content generation

486
00:21:18.245 --> 00:21:20.005
to support things like test prep.

487
00:21:20.025 --> 00:21:21.605
And it's no more, no longer just items,

488
00:21:21.625 --> 00:21:22.685
but it's feedback Yes.

489
00:21:22.685 --> 00:21:24.725
And rationales that can be developed at scale.

490
00:21:25.155 --> 00:21:27.605
It's, it targeted insights that can be developed at scale.

491
00:21:28.105 --> 00:21:30.365
But a a little, a different place that I wanted to go

492
00:21:30.365 --> 00:21:31.765
with this question is, um,

493
00:21:31.995 --> 00:21:35.365
also at ETS we're thinking about, you know, moving, um,

494
00:21:35.635 --> 00:21:38.365
expanding beyond sort of high stake, standardized assessment

495
00:21:38.465 --> 00:21:40.885
to more formative diagnostic assessments.

496
00:21:41.025 --> 00:21:43.405
And with the, with that personalization,

497
00:21:43.795 --> 00:21:46.525
it's really important to be developing socioculturally

498
00:21:46.845 --> 00:21:47.965
relevant content at scale

499
00:21:48.225 --> 00:21:50.085
as we know individuals demonstrate Mm-Hmm.

500
00:21:50.165 --> 00:21:50.885
What they can do most

501
00:21:50.885 --> 00:21:52.205
meaningfully when it's relevant to them.

502
00:21:52.665 --> 00:21:56.205
And so by bringing generative AI capabilities together

503
00:21:56.515 --> 00:22:00.645
with really tuned personalization filters, it's allowed us

504
00:22:00.845 --> 00:22:03.685
to create great efficiencies with building targeted content

505
00:22:03.905 --> 00:22:05.445
for populations around the globe,

506
00:22:05.615 --> 00:22:06.765
which I think is important.

507
00:22:07.225 --> 00:22:10.245
The last kind of point I'll say about this is I also want

508
00:22:10.245 --> 00:22:12.885
to note, um, you know, our data scientists are using it

509
00:22:12.885 --> 00:22:15.085
for coding so they can spend their time really thinking

510
00:22:15.085 --> 00:22:17.685
about the, um, the interpretation of that analysis,

511
00:22:17.685 --> 00:22:18.925
rather than the conduct of it,

512
00:22:19.865 --> 00:22:23.005
our UI/UX team are thinking about how they can use it for

513
00:22:23.185 --> 00:22:24.325
you know, image generation

514
00:22:24.325 --> 00:22:25.845
and things like this as a place to start.

515
00:22:26.345 --> 00:22:29.325
So I just am really impressed with, um, all

516
00:22:29.325 --> 00:22:30.885
of the creative ways I think

517
00:22:30.885 --> 00:22:32.525
that we're figuring out how to leverage it now.

518
00:22:33.035 --> 00:22:34.365
Yeah. If I can just say real quick,

519
00:22:34.825 --> 00:22:36.165
I'm glad you brought up the it part

520
00:22:36.165 --> 00:22:38.285
because we didn't, weren't really talking about that,

521
00:22:38.425 --> 00:22:41.805
but that's a huge area where you're gonna see a lot of use

522
00:22:41.985 --> 00:22:43.805
and innovation and where you can learn from.

523
00:22:44.345 --> 00:22:47.045
And I think we can revise the statement to say, everyone

524
00:22:47.045 --> 00:22:49.565
who did not already know about AI found out

525
00:22:49.565 --> 00:22:50.765
about it on that one day.

526
00:22:50.915 --> 00:22:53.325
Yeah. Which is not how knowledge is usually transferred.

527
00:22:53.325 --> 00:22:54.365
Right. So that's,

528
00:22:55.385 --> 00:22:56.405
I'm glad they made it cool, because

529
00:22:57.245 --> 00:22:58.725
I'm glad that made it cool and exciting,

530
00:22:58.905 --> 00:23:00.845
but it was, it was a lot Mm-Hmm.

531
00:23:01.185 --> 00:23:02.185
At once. Right.

532
00:23:03.335 --> 00:23:06.505
Awesome. Well, fantastic answers everybody.

533
00:23:06.845 --> 00:23:09.345
Uh, so we're gonna shift a little bit into something

534
00:23:09.345 --> 00:23:11.305
that a couple of you have already touched on a little bit.

535
00:23:11.305 --> 00:23:13.505
The next set of questions relates a little bit

536
00:23:13.505 --> 00:23:15.865
to the ethical considerations of AI.

537
00:23:16.445 --> 00:23:20.025
So as AI does continue to evolve, what ongoing

538
00:23:20.585 --> 00:23:23.345
responsibilities do assessment professionals have

539
00:23:23.885 --> 00:23:24.945
in monitoring

540
00:23:25.125 --> 00:23:28.905
and mitigating ethical risks associated with this use, such

541
00:23:28.905 --> 00:23:30.225
as bias as we brought up already?

542
00:23:30.765 --> 00:23:32.665
Um, you know, what kind of ethical guidelines

543
00:23:32.685 --> 00:23:35.545
or frameworks should we be considering as,

544
00:23:35.605 --> 00:23:37.425
as we're venturing into this area?

545
00:23:37.925 --> 00:23:39.705
So, Andre, I'll, I'll start with you on this one.

546
00:23:40.385 --> 00:23:41.725
You know, I'm gonna respond

547
00:23:41.725 --> 00:23:43.645
to your question in sort of an abstract way.

548
00:23:43.825 --> 00:23:46.885
Um, because my, my personal point of view is

549
00:23:46.885 --> 00:23:51.045
that I believe we are hypersensitive to the misuse of AI.

550
00:23:51.205 --> 00:23:52.725
I think we're very, very sensitive.

551
00:23:53.255 --> 00:23:55.405
We've, um, watched how things

552
00:23:55.425 --> 00:23:57.005
during the industrial revolution

553
00:23:57.025 --> 00:23:58.725
and other areas of advancement in

554
00:23:58.725 --> 00:24:00.085
society have misused people.

555
00:24:00.545 --> 00:24:02.685
Um, so as a result of that, you have a lot

556
00:24:02.685 --> 00:24:06.165
of public scrutiny, personal scrutiny, government scrutiny.

557
00:24:06.385 --> 00:24:08.525
You have the governments of various countries all weighing

558
00:24:08.585 --> 00:24:11.605
in on various guidelines on the ethical use of AI.

559
00:24:12.025 --> 00:24:15.005
Um, so I believe that there is a sensitivity there.

560
00:24:15.565 --> 00:24:18.485
I think that sensitivity has always been built into the

561
00:24:18.485 --> 00:24:19.565
assessment community.

562
00:24:19.945 --> 00:24:21.005
Um, because even

563
00:24:21.005 --> 00:24:24.445
before there was AI, there was always a focus on fairness,

564
00:24:24.675 --> 00:24:27.125
reliability, legal defensibility, you know,

565
00:24:27.125 --> 00:24:28.525
keeping bias out of the game.

566
00:24:28.705 --> 00:24:32.285
So that's wired into our DNA. I think our ethical

567
00:24:32.285 --> 00:24:34.485
responsibility is not to avoid it because

568
00:24:35.125 --> 00:24:38.405
'cause there's so much electricity, there's a part of you

569
00:24:38.405 --> 00:24:39.845
that may say, you know what, that's bad.

570
00:24:39.875 --> 00:24:42.485
It's, it's bad juju, I'm gonna stay away from all of that.

571
00:24:43.305 --> 00:24:45.685
Um, but, but I think it's unethical

572
00:24:45.785 --> 00:24:47.365
to take the avoidance route.

573
00:24:47.825 --> 00:24:51.325
Um, so to me, as long as you have guidelines,

574
00:24:51.345 --> 00:24:53.485
you follow some of the best practices to me, first,

575
00:24:53.705 --> 00:24:55.565
do no harm, whatever we do,

576
00:24:55.635 --> 00:24:57.325
just make sure we do no harm in it.

577
00:24:57.745 --> 00:25:02.085
Um, two, keep humans in the loop and as we start,

578
00:25:02.145 --> 00:25:03.685
and the third one is as we start

579
00:25:03.685 --> 00:25:05.045
to think about applications,

580
00:25:05.155 --> 00:25:08.885
make sure we introduce non-destructive applications of AI

581
00:25:09.465 --> 00:25:12.405
to ensure that whatever we're implementing doesn't have a

582
00:25:12.605 --> 00:25:14.325
permanent, a permanent destructive effect.

583
00:25:14.745 --> 00:25:15.805
Um, those things

584
00:25:15.825 --> 00:25:18.325
and those risks will escalate as we go along.

585
00:25:18.425 --> 00:25:20.125
But we wanna make sure as we're learning about

586
00:25:20.125 --> 00:25:21.965
what this tool can do for us, uh,

587
00:25:21.965 --> 00:25:23.765
that we're not just leaving it up to the robot

588
00:25:23.765 --> 00:25:25.685
to make decisions for us, that we, we still have

589
00:25:25.685 --> 00:25:26.725
to perfect and improve it.

590
00:25:27.025 --> 00:25:31.645
Mm-Hmm. Kimberly curious for your thoughts on this.

591
00:25:32.075 --> 00:25:33.485
Okay. So Andre hacked my notes

592
00:25:33.505 --> 00:25:35.445
and said everything I was gonna say, but better.

593
00:25:36.105 --> 00:25:38.245
So I'm having to recalibrate here for a second.

594
00:25:38.845 --> 00:25:39.845
I, I ChatGPT'd

595
00:25:39.905 --> 00:25:42.325
and said, what would Kim say? And that's what I have.

596
00:25:42.635 --> 00:25:44.205
Yeah. You, you nailed it.

597
00:25:44.305 --> 00:25:48.885
Um, so I would, um, just change the focus a little bit

598
00:25:48.945 --> 00:25:50.925
and say ditto to everything Andre just said,

599
00:25:51.385 --> 00:25:54.805
but also I think in addition to the way

600
00:25:54.805 --> 00:25:57.685
that we convey information out about our standards

601
00:25:57.685 --> 00:26:00.245
with guides such as the standards, um,

602
00:26:00.385 --> 00:26:02.965
and other publications, there's going to be a lot

603
00:26:02.965 --> 00:26:05.805
of pressure for us to communicate out our ethical

604
00:26:05.805 --> 00:26:07.125
guidelines as well.

605
00:26:07.745 --> 00:26:11.165
Um, and probably already is pressure in terms of, you know,

606
00:26:11.165 --> 00:26:13.165
some organizations already putting guidelines out there.

607
00:26:13.825 --> 00:26:16.085
You're gonna get asked a lot of questions about that.

608
00:26:16.705 --> 00:26:19.205
Um, I've been doing a lot of reading in the,

609
00:26:19.385 --> 00:26:21.085
the ethical AI field,

610
00:26:21.385 --> 00:26:23.045
and obviously most of

611
00:26:23.045 --> 00:26:25.005
that is not in the assessment arena yet.

612
00:26:25.145 --> 00:26:26.205
So, um, there's a lot

613
00:26:26.205 --> 00:26:27.925
of good information out there outside that.

614
00:26:28.465 --> 00:26:31.565
But I would say a couple things to keep in mind.

615
00:26:32.105 --> 00:26:34.765
Um, one is that one of the criticisms

616
00:26:34.765 --> 00:26:37.685
that you see about organizations having a set

617
00:26:37.685 --> 00:26:41.525
of ethical principles is that they're vague.

618
00:26:41.715 --> 00:26:43.765
They're too high level, um,

619
00:26:43.985 --> 00:26:47.085
and they don't really, um, imply any accountability.

620
00:26:47.385 --> 00:26:49.805
So you can put out a set of ethical guidelines, but

621
00:26:49.825 --> 00:26:52.685
unless you're talking about specific, you know, areas

622
00:26:52.755 --> 00:26:54.565
that tip over into legal restrictions

623
00:26:54.565 --> 00:26:56.325
that might apply based on legislation,

624
00:26:56.325 --> 00:26:58.525
there's not really accountability for you following them.

625
00:26:59.105 --> 00:27:02.605
And you may be setting up promises that you're not enabling

626
00:27:03.145 --> 00:27:05.205
the staff within your organization to keep.

627
00:27:05.785 --> 00:27:08.045
Um, so one way to address this, which is

628
00:27:08.045 --> 00:27:09.685
what we are considering, is essentially you have two

629
00:27:09.685 --> 00:27:10.765
sets of ethical guidelines.

630
00:27:11.145 --> 00:27:14.325
You do have an external one that you publish that reflects,

631
00:27:14.905 --> 00:27:18.525
um, the ways in which your organization is using AI

632
00:27:18.785 --> 00:27:20.365
and the ethical guidelines

633
00:27:20.365 --> 00:27:21.885
and principles you've set around it.

634
00:27:22.425 --> 00:27:25.045
But then also to have an internal set of guidelines

635
00:27:25.195 --> 00:27:28.125
that are more detailed that would get into some

636
00:27:28.125 --> 00:27:29.205
of the specifics of

637
00:27:29.205 --> 00:27:30.565
what following ethical guidelines

638
00:27:30.565 --> 00:27:31.885
really means for your staff.

639
00:27:32.425 --> 00:27:34.365
Um, I've seen the main list framed as well, this is

640
00:27:34.365 --> 00:27:35.685
what you need to tell your developers if

641
00:27:35.685 --> 00:27:36.765
you're working at a tech company.

642
00:27:37.465 --> 00:27:40.525
But, you know, uh, for like one of our organizations,

643
00:27:40.785 --> 00:27:43.165
it could be something like your high level external

644
00:27:43.165 --> 00:27:45.925
guideline has to do with human oversight

645
00:27:46.585 --> 00:27:48.685
and the way in which, um, you know,

646
00:27:48.685 --> 00:27:51.605
we're always gonna have human oversight if we're using AI in

647
00:27:51.605 --> 00:27:52.765
any part of item development.

648
00:27:53.065 --> 00:27:55.485
But then you might have a corresponding internal guideline

649
00:27:55.485 --> 00:27:57.965
that says, every item we produce with AI,

650
00:27:58.365 --> 00:27:59.885
a human expert is going to review it

651
00:27:59.945 --> 00:28:01.085
before it goes on the exam.

652
00:28:01.225 --> 00:28:02.525
So it might actually drill down.

653
00:28:02.985 --> 00:28:06.325
And if nothing else, that will enable cohesiveness

654
00:28:06.345 --> 00:28:08.805
and communication within your company about the fact

655
00:28:08.805 --> 00:28:11.205
that you are following a set of guidelines.

656
00:28:11.385 --> 00:28:12.845
And then, I am not a lawyer.

657
00:28:13.045 --> 00:28:15.005
I cannot speak to all the regulations.

658
00:28:15.745 --> 00:28:18.165
Um, but just trying to keep up with those

659
00:28:18.345 --> 00:28:21.125
as those come out is a challenge in and of itself.

660
00:28:21.225 --> 00:28:24.085
So if you don't already have some lawyers on your

661
00:28:24.845 --> 00:28:26.925
internal teams talking about AI, you need them.

662
00:28:27.195 --> 00:28:30.685
They're great people. And when things like the AI Act from

663
00:28:30.685 --> 00:28:35.085
the EU that was passed, um, this month, that's 108 pages.

664
00:28:35.625 --> 00:28:37.605
I'm still going through it. It's a lot to digest,

665
00:28:38.225 --> 00:28:41.045
but it does sound like even though it had been in the works

666
00:28:41.105 --> 00:28:44.005
for a while, it may have been updated recently, specifically

667
00:28:44.005 --> 00:28:45.285
because of generative AI.

668
00:28:46.135 --> 00:28:48.755
So those kind of things are absolutely gonna be out there.

669
00:28:49.055 --> 00:28:50.355
And you know that

670
00:28:50.415 --> 00:28:52.435
that's gonna be focused on the topic of risks.

671
00:28:52.455 --> 00:28:55.315
So you're gonna be thinking about both how do we practice

672
00:28:56.095 --> 00:28:57.195
AI within our organization?

673
00:28:57.255 --> 00:28:58.715
How do we reassure stakeholders

674
00:28:58.715 --> 00:28:59.915
that we're doing it ethically?

675
00:29:00.415 --> 00:29:02.275
And then, you know, how much are we willing

676
00:29:02.275 --> 00:29:03.795
to risk with our innovation?

677
00:29:04.135 --> 00:29:05.835
Mm-Hmm. So, yeah, I'll leave it at that.

678
00:29:06.685 --> 00:29:10.475
Thank you. So on the other side of things,

679
00:29:10.615 --> 00:29:14.035
are there scenarios where AI can help us reduce bias

680
00:29:14.585 --> 00:29:15.635
instead of introduce?

681
00:29:15.895 --> 00:29:17.835
So Kara, I'd like to start with you on this one.

682
00:29:18.665 --> 00:29:20.355
Sure. I, I love this question.

683
00:29:20.535 --> 00:29:23.075
Um, I spend a lot, I, first of all, I thank, um, Kimberly,

684
00:29:23.075 --> 00:29:24.395
all the insight you just provided was

685
00:29:24.395 --> 00:29:25.835
so insightful, really, really helpful.

686
00:29:25.855 --> 00:29:29.195
And I love the idea of the two sets of, um, of guidelines.

687
00:29:29.195 --> 00:29:30.515
Thanks for that recommendation.

688
00:29:30.975 --> 00:29:33.635
But I, and I love talking about ethical and responsible AI.

689
00:29:33.815 --> 00:29:36.155
Um, I also love flipping it on its head a little bit.

690
00:29:36.215 --> 00:29:39.275
I'm thinking about how AI, how the introduction

691
00:29:39.275 --> 00:29:41.565
and implementation of AI can reduce bias

692
00:29:41.665 --> 00:29:42.685
and increase equity.

693
00:29:42.985 --> 00:29:44.725
And I think there are a couple of examples

694
00:29:44.725 --> 00:29:45.725
where this can happen.

695
00:29:46.305 --> 00:29:48.965
The first, um, is just something really tactical, uh,

696
00:29:48.965 --> 00:29:53.005
here at ETS for example, um, my colleagues, Armin McConney

697
00:29:53.005 --> 00:29:54.485
and Erin Scully led the development

698
00:29:54.485 --> 00:29:56.645
of a new solution called, uh, TOEFL TestReady.

699
00:29:56.665 --> 00:30:00.045
And by using generative AI, we were able

700
00:30:00.045 --> 00:30:03.525
to develop a lot more free content for our test takers.

701
00:30:03.585 --> 00:30:06.445
And we wouldn't have had the opportunity to develop all of

702
00:30:06.445 --> 00:30:08.445
that free content without generative AI

703
00:30:08.445 --> 00:30:10.005
because it was so expensive in the past.

704
00:30:10.185 --> 00:30:11.925
And so, because we were able to do it rapidly

705
00:30:12.265 --> 00:30:14.325
and, um, you know, get it out in time, we're able

706
00:30:14.325 --> 00:30:17.565
to now provide so much more free content to really, I think,

707
00:30:17.565 --> 00:30:20.205
level the playing field for individuals that are engaging

708
00:30:20.235 --> 00:30:22.645
with, um, specifically high stake standardized assessment.

709
00:30:23.185 --> 00:30:25.885
But then to a point that Andre made in the beginning of the,

710
00:30:25.885 --> 00:30:27.005
um, of the beginning of the session

711
00:30:27.005 --> 00:30:28.925
that I think is incredible is that

712
00:30:29.785 --> 00:30:33.045
AI is allowing individuals to demonstrate skills

713
00:30:33.155 --> 00:30:35.405
that they may not even have known that they had

714
00:30:35.945 --> 00:30:37.365
and in different ways,

715
00:30:37.545 --> 00:30:40.845
and allow us to provide signals on those skills such

716
00:30:40.845 --> 00:30:44.085
that they're able to understand what future opportunity

717
00:30:44.345 --> 00:30:47.005
or, um, or, or what future sort

718
00:30:47.005 --> 00:30:48.645
of progress there is for them.

719
00:30:48.865 --> 00:30:50.605
And I don't know, when we focus

720
00:30:50.705 --> 00:30:52.485
so much on the traditional assessments

721
00:30:52.485 --> 00:30:53.485
that we have implemented

722
00:30:53.585 --> 00:30:57.205
before AI, we didn't offer the opportunity for individuals

723
00:30:57.205 --> 00:30:59.045
to demonstrate what they know and can do in a way

724
00:30:59.045 --> 00:31:00.205
that's most meaningful to them.

725
00:31:01.035 --> 00:31:04.325
Certainly. Um, I say that when a lot of the, uh,

726
00:31:04.335 --> 00:31:06.845
novel assessments that we're developing here at ETS

727
00:31:07.205 --> 00:31:09.845
leverage multimodal AI, we have a whole new set

728
00:31:09.845 --> 00:31:12.085
of challenges and a whole new set of considerations

729
00:31:12.085 --> 00:31:13.205
that we need to think about.

730
00:31:13.745 --> 00:31:16.725
But at the end of the day, I really think that, um, as long

731
00:31:16.725 --> 00:31:19.285
as we are maintaining appropriate product development

732
00:31:19.285 --> 00:31:22.005
guidelines, appropriate assessment development guidelines,

733
00:31:22.225 --> 00:31:25.605
um, and then really integrating AI responsibly,

734
00:31:25.825 --> 00:31:28.845
we have the opportunity to provide opportunities for equity

735
00:31:28.845 --> 00:31:30.605
where they didn't otherwise, um, exist.

736
00:31:32.905 --> 00:31:35.445
Great. Pat, do you have anything to add?

737
00:31:36.615 --> 00:31:38.465
Well, just, I wanna highlight back

738
00:31:38.525 --> 00:31:40.025
to something Kimberly talked about

739
00:31:40.245 --> 00:31:43.145
and, uh, I don't remember earlier, um, you know,

740
00:31:43.175 --> 00:31:44.505
what NBME was doing

741
00:31:44.525 --> 00:31:47.545
and they were using, you know, natural language processing

742
00:31:47.545 --> 00:31:48.705
to sort of identify bias.

743
00:31:48.965 --> 00:31:52.745
And I do think the conversation has a little bit been more

744
00:31:53.105 --> 00:31:56.665
slanted towards the negative issues of, of bias with AI.

745
00:31:56.965 --> 00:31:59.225
And clearly they're there. I mean, there are humans involved

746
00:31:59.245 --> 00:32:01.985
and there's some crazy examples if you were at the, uh,

747
00:32:02.325 --> 00:32:05.305
AI at the ATP closing session where they're showing some

748
00:32:05.305 --> 00:32:06.945
of those, they're actually funny, they're so bad.

749
00:32:07.165 --> 00:32:08.825
Um, they're, um, but,

750
00:32:09.605 --> 00:32:13.945
but I'm actually much more bullish on the positive side

751
00:32:13.945 --> 00:32:18.105
of this, that, that we can, um, we can use AI

752
00:32:18.125 --> 00:32:22.425
to look at items and identify patterns that are unintended.

753
00:32:22.845 --> 00:32:26.065
Um, you know, whether it's, uh, you know, using words or,

754
00:32:26.565 --> 00:32:29.425
or linkages that we just, you know, don't want to use.

755
00:32:29.475 --> 00:32:30.825
Right. That sort of thing.

756
00:32:30.885 --> 00:32:34.265
And I actually am hopeful that it'll be more of a golden age

757
00:32:34.825 --> 00:32:36.505
ultimately of, uh, this coming out

758
00:32:36.505 --> 00:32:37.825
of it than, than a negative.

759
00:32:39.455 --> 00:32:42.015
Great. I wanna touch on one more thing

760
00:32:42.015 --> 00:32:44.895
that you all brought up in the ethical discussion,

761
00:32:44.895 --> 00:32:46.975
which is about the importance of the human in the loop.

762
00:32:47.345 --> 00:32:49.335
Right? So I, I'm wondering if in a few words,

763
00:32:49.395 --> 00:32:51.975
you could talk about, you know, how you determine

764
00:32:51.975 --> 00:32:54.655
that threshold for sort of what applications of AI

765
00:32:54.655 --> 00:32:58.095
where a human in the loop is necessary, you know, how

766
00:32:58.095 --> 00:33:00.695
to determine how much human intervention is necessary,

767
00:33:01.315 --> 00:33:03.695
and then finally, sort of how to balance that need

768
00:33:03.715 --> 00:33:05.895
for keeping the human with the need for,

769
00:33:06.235 --> 00:33:08.335
or the desire for increased efficiency.

770
00:33:08.835 --> 00:33:10.455
Uh, so, you know, Kimberly,

771
00:33:10.455 --> 00:33:11.935
curious if you have any thoughts about that.

772
00:33:12.805 --> 00:33:13.935
Sure. I do have a couple.

773
00:33:14.075 --> 00:33:16.735
The main one is that I don't know how you set that rule for

774
00:33:16.785 --> 00:33:18.335
where you need the human oversight

775
00:33:18.335 --> 00:33:19.415
and how much that you need,

776
00:33:19.875 --> 00:33:22.055
but I can recommend a couple things to think about.

777
00:33:22.165 --> 00:33:25.295
There's an old theory, principle, whatever you call it,

778
00:33:25.295 --> 00:33:27.935
from back in the human computer interaction days called HABA

779
00:33:28.005 --> 00:33:29.015
MABA, which is,

780
00:33:29.015 --> 00:33:30.775
humans are better at, machines are better at.

781
00:33:31.665 --> 00:33:34.725
And one thing to keep in mind is that as you're planning AI,

782
00:33:34.725 --> 00:33:37.125
as you're thinking about your responsible use of AI,

783
00:33:37.595 --> 00:33:41.205
that starts with choosing where to use the AI.

784
00:33:41.585 --> 00:33:45.205
And if you're carefully considering each of the steps, you,

785
00:33:45.345 --> 00:33:48.405
you know, run less risk of trying to over automate

786
00:33:48.425 --> 00:33:51.365
or use AI in an area where really the humans are better at

787
00:33:51.365 --> 00:33:55.645
that and you want to retain the humans doing that piece, and

788
00:33:55.645 --> 00:33:58.205
therefore looking for the pieces that are just easier

789
00:33:58.265 --> 00:34:01.205
to automate that the AI is going to be better at.

790
00:34:01.665 --> 00:34:03.685
And then the second really has to do

791
00:34:03.685 --> 00:34:05.885
with your organization's appetite for risk.

792
00:34:06.265 --> 00:34:09.165
You know, a testing organization is gonna be different from,

793
00:34:09.425 --> 00:34:10.605
um, perhaps a vendor

794
00:34:10.785 --> 00:34:14.525
or a tech company where, you know, it,

795
00:34:14.665 --> 00:34:18.925
it depends on what you are comfortable doing with the AI

796
00:34:18.985 --> 00:34:21.645
and risking having a mistake if you don't

797
00:34:21.645 --> 00:34:22.685
have the human oversight.

798
00:34:22.745 --> 00:34:24.845
So I mentioned earlier text matching software

799
00:34:25.275 --> 00:34:27.045
that we could use to identify enemies,

800
00:34:27.385 --> 00:34:30.045
but in our case, I'm kind of using shorthand language.

801
00:34:30.185 --> 00:34:31.445
The machine doesn't decide anything.

802
00:34:31.865 --> 00:34:34.725
The software that we develop just prioritizes all the

803
00:34:34.885 --> 00:34:36.165
decisions that the humans need to make

804
00:34:36.185 --> 00:34:38.765
and makes it easier for the humans to do that piece.
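
Kimberly's point here is that the text-matching software does not decide anything; it only prioritizes likely "enemy" item pairs, items too similar to appear on the same form, so the humans review the riskiest pairs first. A minimal sketch of that idea, assuming a hypothetical item bank and off-the-shelf TF-IDF cosine similarity rather than NBME's actual software:

```python
# A minimal sketch (not NBME's software) of text matching that only
# prioritizes likely "enemy" item pairs for human review.
from itertools import combinations

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical item stems; not real exam content.
items = {
    "ITEM-001": "A 54-year-old man presents with chest pain radiating to the left arm.",
    "ITEM-002": "A 55-year-old man presents with chest pain radiating to the left arm.",
    "ITEM-003": "A 7-year-old girl presents with a barking cough and stridor.",
}

ids = list(items)
tfidf = TfidfVectorizer(stop_words="english").fit_transform([items[i] for i in ids])
sim = cosine_similarity(tfidf)

# Rank item pairs by similarity; reviewers work from the top of the list down.
ranked = sorted(
    ((sim[a, b], ids[a], ids[b]) for a, b in combinations(range(len(ids)), 2)),
    reverse=True,
)
for score, left, right in ranked:
    print(f"{left} vs {right}: similarity {score:.2f}")
```

The machine still decides nothing; it just hands the humans an ordered queue.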

805
00:34:39.085 --> 00:34:41.365
Hmm. So if you aren't already thinking about risk,

806
00:34:41.515 --> 00:34:43.165
there's also a lot of that mentioned in,

807
00:34:43.165 --> 00:34:44.805
in the legal guidance that's coming out.

808
00:34:45.225 --> 00:34:46.605
So if you don't already have sort

809
00:34:46.605 --> 00:34:47.805
of a risk assessment process

810
00:34:48.065 --> 00:34:49.725
or a risk scorecard for AI,

811
00:34:50.185 --> 00:34:52.445
that's something I would recommend doing first

812
00:34:52.595 --> 00:34:55.765
because that's going to, in, in addition to other things,

813
00:34:56.035 --> 00:34:58.645
help you identify areas where you're gonna say, no, we have

814
00:34:58.645 --> 00:35:00.205
to keep the human in the loop on this one.

815
00:35:00.385 --> 00:35:01.385
Mm-Hmm.
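
For readers who don't yet have the risk assessment process Kimberly recommends, here is one minimal, hypothetical way a risk scorecard for an AI use case could look. The dimensions, weights, and thresholds below are illustrative assumptions for the sketch, not a standard endorsed by the panel or by any of the legal guidance she mentions:

```python
# A hypothetical AI risk scorecard; dimensions and thresholds are
# illustrative only, not an industry standard.
from dataclasses import dataclass

@dataclass
class AIUseCase:
    name: str
    stakes: int            # 1 (low) to 5 (high) consequence of a wrong output
    autonomy: int          # 1 (suggests only) to 5 (acts without review)
    reversibility: int     # 1 (easily undone) to 5 (hard to undo)
    data_sensitivity: int  # 1 (public data) to 5 (highly sensitive)

    def risk_score(self) -> int:
        return self.stakes + self.autonomy + self.reversibility + self.data_sensitivity

    def oversight(self) -> str:
        score = self.risk_score()
        if score >= 14:
            return "human in the loop required (review every output)"
        if score >= 9:
            return "human on the loop (sample and monitor outputs)"
        return "periodic audit"

for case in [
    AIUseCase("Draft items for a licensure exam", stakes=5, autonomy=3, reversibility=3, data_sensitivity=3),
    AIUseCase("Summarize internal meeting notes", stakes=1, autonomy=2, reversibility=1, data_sensitivity=2),
]:
    print(f"{case.name}: score {case.risk_score()} -> {case.oversight()}")
```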

816
00:35:01.955 --> 00:35:03.585
Great. And Kara, anything to add to that?

817
00:35:04.245 --> 00:35:06.575
Yeah, no, I, I love what Kimberly said about sort

818
00:35:06.575 --> 00:35:09.215
of the technology helping to prioritize for humans.

819
00:35:09.475 --> 00:35:11.895
And one of the, um, one of the innovations

820
00:35:11.895 --> 00:35:13.935
that I find incredibly, um, impressive

821
00:35:13.955 --> 00:35:17.215
and that holds a lot of promise is integrating confidence

822
00:35:17.215 --> 00:35:20.095
measures inside the algorithms that help you understand when

823
00:35:20.095 --> 00:35:22.375
to kick out those items or when to kick out the scores

824
00:35:22.435 --> 00:35:23.615
to have humans review them.

825
00:35:23.615 --> 00:35:25.575
And when we know that potentially this is one

826
00:35:25.575 --> 00:35:28.095
that doesn't need as many reviews or something like that.

827
00:35:28.275 --> 00:35:31.535
So, um, those innovations on creating those efficiencies

828
00:35:31.555 --> 00:35:33.655
for the humans that are in the loop, I think are incredibly,

829
00:35:33.655 --> 00:35:34.735
incredibly promising.
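
What Kara describes, confidence measures that decide when to kick an item or score out for human review, can be sketched as a simple routing rule. The toy_model stand-in and the 0.85 threshold below are illustrative assumptions, not any vendor's actual pipeline:

```python
# A minimal sketch of confidence-based routing for human review.
# Assumes the scoring model returns (score, confidence); the 0.85
# threshold is an illustrative assumption, not a recommended value.
from typing import Callable, Tuple

def route_response(
    response_text: str,
    score_model: Callable[[str], Tuple[float, float]],
    confidence_threshold: float = 0.85,
) -> dict:
    score, confidence = score_model(response_text)
    needs_human = confidence < confidence_threshold
    return {
        "score": score,
        "confidence": confidence,
        "route": "human review queue" if needs_human else "auto-accept (spot-checked)",
    }

# Hypothetical stand-in for an automated scoring model.
def toy_model(text: str) -> Tuple[float, float]:
    return (3.0, 0.62 if len(text) < 40 else 0.93)

print(route_response("Short, ambiguous answer.", toy_model))
print(route_response("A longer, well-developed answer the model has seen many of.", toy_model))
```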

830
00:35:35.235 --> 00:35:36.895
One other thing I'll add is that, um,

831
00:35:37.385 --> 00:35:40.605
it really is about the consequences of the, uh, decisions

832
00:35:40.605 --> 00:35:42.005
that are being made based on

833
00:35:42.005 --> 00:35:43.085
the assessments that are being taken.

834
00:35:43.625 --> 00:35:46.565
As I mentioned, ETS, of course, we have our, uh,

835
00:35:46.565 --> 00:35:49.085
primary assessments, TOEFL, TOEIC, GRE, and Praxis,

836
00:35:49.385 --> 00:35:50.805
and those have significant

837
00:35:50.805 --> 00:35:52.045
consequences associated with them.

838
00:35:52.045 --> 00:35:53.325
They are high stakes assessments.

839
00:35:53.505 --> 00:35:56.005
We are also moving toward formative assessment

840
00:35:56.145 --> 00:35:57.285
and diagnostic assessment

841
00:35:57.345 --> 00:35:58.765
and lower and mid stakes assessment.

842
00:35:59.305 --> 00:36:01.925
So the way that I think about it is there is this spectrum,

843
00:36:02.345 --> 00:36:05.005
and I always consider our high stakes assessments will

844
00:36:05.005 --> 00:36:06.405
always have humans in the loop.

845
00:36:06.405 --> 00:36:09.005
Humans will always be reviewing content that's generated.

846
00:36:09.005 --> 00:36:11.485
Humans will always be reviewing scores that are being

847
00:36:11.805 --> 00:36:15.565
provided, and then we move toward what people are referring

848
00:36:15.565 --> 00:36:17.925
to as humans on the loop, which is more

849
00:36:17.925 --> 00:36:20.565
of this supervisory role that humans are playing.

850
00:36:20.945 --> 00:36:23.685
And so it might be that humans have been in the loop,

851
00:36:23.715 --> 00:36:26.565
they have generated the models, they have built the rubrics,

852
00:36:26.565 --> 00:36:27.765
they have deployed the models,

853
00:36:28.105 --> 00:36:30.405
and then we've let the models go into formative assessments

854
00:36:30.405 --> 00:36:31.605
or into learning products.

855
00:36:31.865 --> 00:36:35.165
And then those humans are continually monitoring the output

856
00:36:35.165 --> 00:36:37.725
of those models and continuing to evaluate them

857
00:36:37.785 --> 00:36:39.365
and might step in when they need

858
00:36:39.365 --> 00:36:40.565
to be fine tuned or tweaked.

859
00:36:40.905 --> 00:36:43.485
But I think about it as high stakes assessment.

860
00:36:43.615 --> 00:36:45.005
We'll always have humans in the loop

861
00:36:45.065 --> 00:36:46.885
as we move toward lower mid stakes assessment.

862
00:36:46.885 --> 00:36:48.845
It'll be humans on the loop supervisory.

863
00:36:49.105 --> 00:36:52.165
But I don't think humans can ever really be

864
00:36:52.165 --> 00:36:53.285
removed from the loop.

865
00:36:53.435 --> 00:36:54.525
It's just not reasonable

866
00:36:54.525 --> 00:36:56.405
because we need subject matter experts

867
00:36:56.425 --> 00:36:57.725
to be developing the models.

868
00:36:57.825 --> 00:36:59.405
We need folks to be deploying them.

869
00:36:59.875 --> 00:37:02.605
It's just a matter of pairing the consequences

870
00:37:02.705 --> 00:37:04.805
of the assessment or the use of the scores

871
00:37:05.195 --> 00:37:06.605
with the oversight of the humans.

872
00:37:07.145 --> 00:37:09.315
Mm-Hmm. I like that.

873
00:37:09.315 --> 00:37:11.355
Human in the loop versus human on the loop.

874
00:37:11.425 --> 00:37:13.035
That's the first I've heard it. I don't know if you're gonna

875
00:37:13.035 --> 00:37:14.155
coin that phrase, Kara, but

876
00:37:14.625 --> 00:37:15.595
It's not mine. It's

877
00:37:15.595 --> 00:37:16.595
Not mine.

878
00:37:17.775 --> 00:37:20.235
Uh, all right. I wanna shift gears a teeny bit

879
00:37:20.235 --> 00:37:22.555
because Pat Ward, you mentioned security at the beginning

880
00:37:22.555 --> 00:37:24.635
of this, and I think that is a sort of a big topic

881
00:37:24.635 --> 00:37:26.995
that we should at least touch on.

882
00:37:27.895 --> 00:37:29.235
Uh, you know, this is another area

883
00:37:29.245 --> 00:37:32.475
where AI poses both new benefits and new challenges.

884
00:37:33.255 --> 00:37:36.075
So, you know, AI has the potential to enhance the security

885
00:37:36.075 --> 00:37:37.555
of assessments with new technologies.

886
00:37:38.135 --> 00:37:41.035
Uh, but also there are new advantages for cheaters as well.

887
00:37:41.295 --> 00:37:43.555
So I'm curious if you can share your perspective

888
00:37:43.605 --> 00:37:44.995
about AI and security.

889
00:37:45.695 --> 00:37:47.635
It, it's funny 'cause I was listening

890
00:37:47.635 --> 00:37:48.915
to Kara on the last answer.

891
00:37:49.055 --> 00:37:52.875
And um, you know, I I think part of the, that answer is

892
00:37:52.875 --> 00:37:53.995
that the stakes matter,

893
00:37:54.095 --> 00:37:56.435
and it matters from a security perspective too.

894
00:37:56.505 --> 00:37:57.675
It's, it matters for the content.

895
00:37:58.495 --> 00:38:01.635
Um, the, the first thing I want to say is

896
00:38:01.635 --> 00:38:06.475
that AI makes it easier to cheat in a remote proctored test.

897
00:38:06.935 --> 00:38:10.035
Um, and I, I wanna, I just wanna call that out

898
00:38:10.135 --> 00:38:11.275
as a, as a key point.

899
00:38:11.295 --> 00:38:12.955
And, and this is not just a theory.

900
00:38:13.155 --> 00:38:15.675
I mean, we, at ITS, we actually created a proof of concept

901
00:38:15.805 --> 00:38:19.115
where we capture the item off the screen, we send it off,

902
00:38:19.655 --> 00:38:23.275
um, to, to be scored, and we get back an answer,

903
00:38:23.375 --> 00:38:25.435
and then we automatically mark the answer

904
00:38:25.455 --> 00:38:26.955
or show some information on the screen.

905
00:38:27.575 --> 00:38:29.875
And this'll work in just about anybody's test.

906
00:38:30.175 --> 00:38:32.115
Um, so we know it's there.

907
00:38:32.255 --> 00:38:35.725
And this, this means there's a few big changes from I think

908
00:38:35.735 --> 00:38:38.005
what we've been used to. First of all,

909
00:38:38.005 --> 00:38:39.805
we're making it easier for the cheater.

910
00:38:39.905 --> 00:38:42.285
Um, you don't have to like call up some, you know,

911
00:38:42.495 --> 00:38:44.965
slimy proxy tester and let them in your computer

912
00:38:44.985 --> 00:38:46.285
and have a relationship with them.

913
00:38:46.625 --> 00:38:48.285
Now it's just using a piece of software.

914
00:38:48.355 --> 00:38:50.205
It's a lot easier for that casual cheater.

915
00:38:50.745 --> 00:38:53.205
Um, maybe one of the more interesting ones though, is

916
00:38:53.525 --> 00:38:55.805
that large item pools are no longer the answer, right?

917
00:38:56.005 --> 00:38:57.725
I mean, for years what I said was, Hey,

918
00:38:57.995 --> 00:38:59.245
just create more items.

919
00:38:59.425 --> 00:39:02.685
You know, um, you got 10,000 items, people can't, they,

920
00:39:02.685 --> 00:39:04.965
they can't, you know, uh, memorize all of them. Well,

921
00:39:04.965 --> 00:39:06.005
that that's not true anymore

922
00:39:06.005 --> 00:39:08.365
if the AI can just answer them. And there, there's a related piece to this,

923
00:39:08.365 --> 00:39:12.005
which is that timing isn't the solution either.

924
00:39:12.385 --> 00:39:14.165
Um, you know, um, you know,

925
00:39:14.165 --> 00:39:17.405
making an item timed in one minute is not going to actually,

926
00:39:17.585 --> 00:39:19.565
you know, solve this particular problem.

927
00:39:19.585 --> 00:39:20.885
It does, it does help, right?

928
00:39:20.885 --> 00:39:23.685
If you're doing a, a test in a browser where you know,

929
00:39:23.685 --> 00:39:26.165
it's a very low stakes test.

930
00:39:26.345 --> 00:39:29.885
Yes. Um, they don't have time to like bring up another tool

931
00:39:29.905 --> 00:39:31.005
and type in a search,

932
00:39:31.105 --> 00:39:32.525
but using an automated tool,

933
00:39:32.525 --> 00:39:33.845
which is pulling content from the screen

934
00:39:33.845 --> 00:39:34.765
and just doing all that work

935
00:39:34.765 --> 00:39:35.845
for you, it doesn't help at all.

936
00:39:36.385 --> 00:39:38.645
Um, so, you know, the, the solutions

937
00:39:38.665 --> 00:39:41.525
to this are basically technology, you know,

938
00:39:41.525 --> 00:39:43.685
and, you know, better technology to block it.

939
00:39:43.685 --> 00:39:45.165
That's mostly like a secure browser thing.

940
00:39:45.465 --> 00:39:48.605
And then it still comes back to test design,

941
00:39:48.685 --> 00:39:52.125
assessment design, uh, item types, the structure

942
00:39:52.145 --> 00:39:54.725
of the item types, how the content is presented on the

943
00:39:54.725 --> 00:39:56.205
screen, how it interacts,

944
00:39:56.205 --> 00:39:58.445
or whether it's a performance based item, um,

945
00:39:58.585 --> 00:40:00.765
how the total test is, is designed.

946
00:40:00.975 --> 00:40:03.525
These things matter more than they have ever mattered.

947
00:40:03.745 --> 00:40:06.445
Um, I also kind of wanna just make a quick point here. You know,

948
00:40:06.505 --> 00:40:09.165
people talk about what are your takeaways from ATP?

949
00:40:09.705 --> 00:40:12.965
My biggest takeaway was actually, uh, something I,

950
00:40:13.205 --> 00:40:14.445
I picked up really early

951
00:40:14.505 --> 00:40:17.805
before the conference almost, um, where I, I went to a,

952
00:40:17.925 --> 00:40:21.645
a session and, um, um, it was, it was basically about

953
00:40:21.645 --> 00:40:23.605
how cheaters are a threat to our industry.

954
00:40:23.625 --> 00:40:27.045
And if we, if we let people basically affect the

955
00:40:27.045 --> 00:40:28.445
validity of our test, right?

956
00:40:28.445 --> 00:40:31.525
Where the, the validity becomes questionable to the point

957
00:40:31.525 --> 00:40:33.725
where you can't,

958
00:40:33.725 --> 00:40:35.165
can't count on somebody who's certified,

959
00:40:35.165 --> 00:40:36.165
really meaning something,

960
00:40:36.515 --> 00:40:38.285
that pretty much destroys our industry.

961
00:40:38.285 --> 00:40:39.845
That, that kind of thing gets my attention

962
00:40:39.945 --> 00:40:40.965
as a business owner, right?

963
00:40:41.385 --> 00:40:43.405
Um, you know, and it's something that should get all

964
00:40:43.405 --> 00:40:45.125
of our attention, uh, on this.

965
00:40:45.185 --> 00:40:47.405
And I think we have to put an investment in it.

966
00:40:47.425 --> 00:40:49.445
And I wanted to make a little shout out to the group

967
00:40:49.675 --> 00:40:52.445
that was at that, uh, Sunday meeting that I went to,

968
00:40:52.445 --> 00:40:53.765
which was, uh, uh,

969
00:40:53.925 --> 00:40:57.165
a group called the CIAA. I can always remember it

970
00:40:57.285 --> 00:40:58.525
'cause of the CIA part.

971
00:40:58.745 --> 00:41:00.605
Um, but it stands for, I'm gonna read this off my screen,

972
00:41:00.685 --> 00:41:03.165
the Credential Integrity Action Alliance.

973
00:41:03.305 --> 00:41:05.405
So I just wanna suggest that you look 'em up

974
00:41:05.705 --> 00:41:07.125
and, uh, you support 'em.

975
00:41:07.305 --> 00:41:08.305
So it's important.

976
00:41:09.525 --> 00:41:11.795
Great. Thanks Pat. Andre, do you have anything to add?

977
00:41:12.415 --> 00:41:14.115
You know, I'll, I'll keep mine short here.

978
00:41:14.215 --> 00:41:17.435
Um, and I agree wholeheartedly, wholeheartedly with Pat.

979
00:41:17.535 --> 00:41:18.915
Um, my company is not

980
00:41:18.915 --> 00:41:21.115
so much in the higher stakes assessment arena.

981
00:41:21.175 --> 00:41:23.635
So what I'm gonna say, maybe a little bit controversial,

982
00:41:23.645 --> 00:41:25.035
maybe a little bit dreamerish.

983
00:41:25.295 --> 00:41:28.595
So just, uh, you know, um, um, uh, humor me.

984
00:41:28.935 --> 00:41:32.075
Um, I, I believe that the greater opportunity is

985
00:41:32.075 --> 00:41:35.885
that AI is going to create opportunities for measurement

986
00:41:35.885 --> 00:41:38.285
where this type of security may not be an

987
00:41:38.285 --> 00:41:39.365
issue so much anymore.

988
00:41:39.865 --> 00:41:43.205
Um, you know, imagine a world where the assessment is not

989
00:41:43.225 --> 00:41:45.165
so much, uh, questions and items.

990
00:41:45.395 --> 00:41:46.845
It's more of, Hey, I click on something

991
00:41:46.845 --> 00:41:49.365
and you watch me work, you watch me have a meeting, uh,

992
00:41:49.365 --> 00:41:50.925
you measure my interaction in that meeting,

993
00:41:51.545 --> 00:41:52.805
and then you tell me whether

994
00:41:52.805 --> 00:41:54.205
or not I have the skill capability

995
00:41:54.225 --> 00:41:56.845
or whatever, that will take me further along in my process.

996
00:41:57.145 --> 00:42:00.365
So, um, I just want us to make sure that we're cautious,

997
00:42:00.365 --> 00:42:03.205
that we're not applying old methodologies to

998
00:42:03.205 --> 00:42:04.485
what is both a novel

999
00:42:04.665 --> 00:42:08.085
and very abstract application of technology such

1000
00:42:08.085 --> 00:42:09.605
that we forget that the people

1001
00:42:09.675 --> 00:42:13.365
that are frustrated about AI can also become very frustrated

1002
00:42:13.365 --> 00:42:14.805
about the testing industry as well.

1003
00:42:15.225 --> 00:42:17.805
Um, so that's just as much of a threat as, uh,

1004
00:42:17.905 --> 00:42:19.285
AI in and of itself.

1005
00:42:19.675 --> 00:42:21.605
Yeah, and I'd agree with Andre too, there.

1006
00:42:21.785 --> 00:42:23.405
I'm, I'm a hundred percent with him on that.

1007
00:42:24.995 --> 00:42:26.985
Fascinating. Alright.

1008
00:42:27.405 --> 00:42:29.945
Um, quickly, I'm curious to know

1009
00:42:30.165 --> 00:42:31.905
how you'd recommend balancing.

1010
00:42:31.905 --> 00:42:33.385
So we talked about risk a little bit, right?

1011
00:42:33.685 --> 00:42:36.425
How you'd recommend balancing the potential competitive edge

1012
00:42:36.425 --> 00:42:38.305
that AI provides your organization

1013
00:42:38.775 --> 00:42:40.345
with the inherent risks that it poses.

1014
00:42:40.445 --> 00:42:41.825
And how should industry leaders

1015
00:42:41.845 --> 00:42:43.785
who are on this call be thinking about risk?

1016
00:42:44.035 --> 00:42:46.265
Andre, I'm gonna keep the momentum with you. Okay.

1017
00:42:46.725 --> 00:42:48.905
So, so as a business person, I'm a,

1018
00:42:48.905 --> 00:42:50.865
I'm a tech person converted to a business person.

1019
00:42:51.415 --> 00:42:52.425
Part of your role

1020
00:42:52.565 --> 00:42:54.265
as a business person is you're taking risks.

1021
00:42:54.375 --> 00:42:56.985
It's not to rinse and repeat what you've learned in history.

1022
00:42:56.985 --> 00:42:58.465
So there's going to be inherent risk

1023
00:42:58.465 --> 00:42:59.545
in what you do every day.

1024
00:43:00.015 --> 00:43:02.065
Frankly, I believe, uh,

1025
00:43:02.365 --> 00:43:04.745
AI at some point will not be a competitive edge.

1026
00:43:04.745 --> 00:43:06.465
It's going to be a competitive necessity.

1027
00:43:06.965 --> 00:43:09.345
Uh, meaning that if you are not utilizing it,

1028
00:43:09.345 --> 00:43:12.625
your competitors or new startups will be actively using it.

1029
00:43:13.005 --> 00:43:16.305
And at some point they will either outmaneuver you, become

1030
00:43:16.305 --> 00:43:19.345
more efficient than you, or they may even start

1031
00:43:19.445 --> 00:43:23.225
to gain competitive, competitive intelligence about you, uh,

1032
00:43:23.225 --> 00:43:25.705
because they can just ask ChatGPT

1033
00:43:25.735 --> 00:43:27.705
what your company has done or what you're doing

1034
00:43:27.705 --> 00:43:28.825
and what your philosophy is.

1035
00:43:28.825 --> 00:43:30.505
And it'll take a guess at it based upon

1036
00:43:30.535 --> 00:43:31.665
what you have on the internet

1037
00:43:31.665 --> 00:43:33.065
and what other people have learned about you.

1038
00:43:33.565 --> 00:43:36.985
Um, so no, I I, I don't necessarily look at it

1039
00:43:36.985 --> 00:43:37.985
as a competitive advantage,

1040
00:43:38.085 --> 00:43:40.385
but I think it is a novel technology that's going

1041
00:43:40.385 --> 00:43:42.905
to represent a, a necessity for us, uh,

1042
00:43:42.925 --> 00:43:44.385
as we move along this path.

1043
00:43:44.845 --> 00:43:48.325
Mm. Kara, do you have any thoughts on this?

1044
00:43:48.785 --> 00:43:50.605
Um, I think that was eloquently said.

1045
00:43:50.605 --> 00:43:54.085
The only thing I will add is, um, folks, I recommend

1046
00:43:54.085 --> 00:43:56.525
that you protect your team, um, at your company.

1047
00:43:56.765 --> 00:43:57.765
I think that's really important.

1048
00:43:57.945 --> 00:43:59.845
Uh, we have a great team here led by Wally,

1049
00:43:59.845 --> 00:44:01.965
who has put out some great safeguards and guidelines.

1050
00:44:02.545 --> 00:44:05.765
And of course, as a person who heads innovation at ETS,

1051
00:44:05.765 --> 00:44:06.765
sometimes we butt heads,

1052
00:44:06.925 --> 00:44:08.165
I wanna do that, I wanna go this way.

1053
00:44:08.165 --> 00:44:09.525
And there are these safeguards, but really

1054
00:44:09.525 --> 00:44:11.645
what we're doing is protecting our team

1055
00:44:11.665 --> 00:44:14.085
and protecting our staff from, um,

1056
00:44:14.325 --> 00:44:16.445
concerns or dangers they might not even know about.

1057
00:44:16.445 --> 00:44:17.725
They might not even know what they're getting into.

1058
00:44:17.825 --> 00:44:19.085
And so I think that's, um,

1059
00:44:19.125 --> 00:44:20.925
a great thing an organization can do for their team.

1060
00:44:21.735 --> 00:44:24.085
Great. Pat, any final thoughts?

1061
00:44:24.635 --> 00:44:27.565
Well, just to build off of what Andre said,

1062
00:44:27.645 --> 00:44:29.565
I I actually think for a provider,

1063
00:44:29.715 --> 00:44:31.620
it's basically table stakes today.

1064
00:44:31.805 --> 00:44:34.125
I mean, like for an item bank, you have to have this today.

1065
00:44:34.245 --> 00:44:35.765
I mean, we're already to that point

1066
00:44:35.765 --> 00:44:36.965
where you have to build it in.

1067
00:44:37.305 --> 00:44:41.685
Um, I wanted to share, um, just to paraphrase, uh,

1068
00:44:41.685 --> 00:44:44.525
something I heard at the I.C.E. conference, um,

1069
00:44:44.985 --> 00:44:48.565
and what the, the keynote speaker said was that your job

1070
00:44:49.225 --> 00:44:52.605
or business will not be replaced by AI, it'll be replaced

1071
00:44:52.605 --> 00:44:55.045
by someone who knows AI better than you

1072
00:44:55.425 --> 00:44:57.205
and I, you know, that to me is sort

1073
00:44:57.205 --> 00:44:58.285
of the whole piece, right?

1074
00:44:58.385 --> 00:45:02.085
You know, um, invest today, you know, get your processes

1075
00:45:02.085 --> 00:45:04.845
and your best practices, guidelines, get 'em in place and,

1076
00:45:04.985 --> 00:45:06.245
and, and move, right?

1077
00:45:06.315 --> 00:45:08.045
This is not something to ignore.

1078
00:45:09.735 --> 00:45:14.645
Great. Alright, well, we have one more question, uh,

1079
00:45:14.645 --> 00:45:16.765
for the panel before moving on to our Q and A.

1080
00:45:16.765 --> 00:45:17.845
We've made great time here

1081
00:45:18.105 --> 00:45:20.005
and I'd like to conclude, you know,

1082
00:45:20.105 --> 00:45:21.565
we all started on a positive note

1083
00:45:21.565 --> 00:45:23.285
and I would also like to end on a positive note

1084
00:45:23.945 --> 00:45:25.525
by talking about the bright future ahead.

1085
00:45:25.945 --> 00:45:27.245
So considering the trajectory

1086
00:45:27.585 --> 00:45:30.645
of AI innovation in our industry

1087
00:45:30.645 --> 00:45:32.485
and beyond, really, you know,

1088
00:45:32.485 --> 00:45:34.765
what possibilities excite you the most,

1089
00:45:34.765 --> 00:45:38.205
what makes your eyes light up, um, about the, the potential,

1090
00:45:38.225 --> 00:45:40.165
the bright possibilities for AI in our industry.

1091
00:45:40.865 --> 00:45:42.365
So try to keep your answers to about a minute

1092
00:45:42.365 --> 00:45:44.525
or less so we have time for Q and A at the end,

1093
00:45:44.985 --> 00:45:46.525
but Kimberly, I'm gonna start with you.

1094
00:45:47.335 --> 00:45:49.185
Yeah. So I'm gonna jump back to something

1095
00:45:49.185 --> 00:45:50.305
that Andre just said

1096
00:45:50.405 --> 00:45:53.385
and say, I am also excited about moving to a world

1097
00:45:53.385 --> 00:45:56.265
that is not just based on MCQs, wonderful as they are,

1098
00:45:56.925 --> 00:45:58.825
but that also means I'm being a little bit retro

1099
00:45:59.005 --> 00:46:01.745
and saying, you know, automated scoring of item text

1100
00:46:01.765 --> 00:46:05.105
or speech, even though that has been around and predates AI.

1101
00:46:05.605 --> 00:46:07.425
Um, that's actually what I'm most excited about

1102
00:46:07.425 --> 00:46:11.025
because if you're starting to talk about some more complex,

1103
00:46:11.325 --> 00:46:14.265
um, constructs that we may want to measure in medicine, like

1104
00:46:15.225 --> 00:46:17.745
specifically clinical, excuse me, clinical reasoning,

1105
00:46:17.745 --> 00:46:19.105
additional communication skills,

1106
00:46:19.155 --> 00:46:21.145
maybe something very ambitious like teamwork.

1107
00:46:21.965 --> 00:46:23.825
Um, and you maybe want to do that

1108
00:46:23.825 --> 00:46:25.065
for a lot of examinees at once.

1109
00:46:25.085 --> 00:46:26.545
You might wanna do it at different stages.

1110
00:46:26.615 --> 00:46:29.625
Undergrad education or graduate medical education.

1111
00:46:30.325 --> 00:46:32.025
Um, even for formative assessment,

1112
00:46:32.045 --> 00:46:33.505
you're gonna need to make that scalable.

1113
00:46:33.605 --> 00:46:35.105
You're gonna need to make it efficient.

1114
00:46:35.245 --> 00:46:36.865
You're gonna have to have humans in the loop

1115
00:46:36.925 --> 00:46:38.105
for some part of it.

1116
00:46:38.205 --> 00:46:40.385
So I am actually very excited about, you know,

1117
00:46:40.385 --> 00:46:43.105
the advancements and enhancements, increased accuracy

1118
00:46:43.125 --> 00:46:45.425
for LLMs and other methods that we can use

1119
00:46:45.425 --> 00:46:46.745
to support automated scoring.
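
As a rough illustration of the automated scoring Kimberly is describing, a minimal sketch of rubric-based free-text scoring with an LLM might look like the following. The rubric wording, the call_llm stub, and the 0-3 scale are all illustrative assumptions; a real program would validate the rubric and calibrate model scores against trained human raters:

```python
# A minimal sketch of rubric-based scoring of free-text responses with an LLM.
# call_llm is a hypothetical stub so the sketch runs; it is not a real API.
import json

# Illustrative 0-3 rubric; not an actual NBME or ETS rubric.
RUBRIC = (
    "Score the response from 0-3 for clinical reasoning "
    "(0 = no plausible reasoning, 1 = partial, 2 = mostly sound, 3 = complete and well justified). "
    'Return JSON like {"score": 2, "rationale": "<one sentence>"}.'
)

def call_llm(prompt: str) -> str:
    # Hypothetical stub; in practice this would call whatever model
    # endpoint your organization has approved.
    return '{"score": 2, "rationale": "Identifies the likely diagnosis but omits a key differential."}'

def score_free_text(case_vignette: str, examinee_response: str) -> dict:
    prompt = f"{RUBRIC}\n\nCase:\n{case_vignette}\n\nExaminee response:\n{examinee_response}"
    result = json.loads(call_llm(prompt))  # fail loudly if the model breaks the format
    assert result["score"] in (0, 1, 2, 3)
    return result

print(score_free_text("A 62-year-old woman with sudden shortness of breath.", "I would first rule out pulmonary embolism."))
```

The confidence-based routing sketched earlier would sit on top of this, deciding which scores still go to human raters.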

1120
00:46:47.805 --> 00:46:51.635
Great. All right, Andre, what got you excited?

1121
00:46:52.015 --> 00:46:54.515
Uh, uh, everything. I'm, I'm excited for everything.

1122
00:46:54.695 --> 00:46:57.715
Um, now, um, really excited about the new jobs

1123
00:46:57.855 --> 00:46:59.635
and roles that'll be created, you know,

1124
00:46:59.695 --> 00:47:02.995
so we have a new role called Prompt Engineer, right?

1125
00:47:02.995 --> 00:47:05.435
That didn't exist five, 10 years ago.

1126
00:47:05.455 --> 00:47:08.475
So this is going to create other adjacent roles and titles

1127
00:47:08.475 --> 00:47:09.955
and career opportunities for people,

1128
00:47:10.105 --> 00:47:13.035
because really it's about asking the right questions,

1129
00:47:13.035 --> 00:47:14.435
framing the questions up so

1130
00:47:14.435 --> 00:47:15.675
that the computer can understand.

1131
00:47:16.135 --> 00:47:20.195
And then, um, just really, really excited about the new ways

1132
00:47:20.575 --> 00:47:24.315
of, um, allowing curious people to become more curious.

1133
00:47:25.015 --> 00:47:28.355
So now, rather than asking simple questions, you're able

1134
00:47:28.355 --> 00:47:30.765
to ask more complex questions at deeper levels

1135
00:47:30.865 --> 00:47:33.445
and go down further rabbit holes, uh, to be able

1136
00:47:33.445 --> 00:47:34.925
to increase our ability to serve our

1137
00:47:34.925 --> 00:47:36.045
communities and our customers.

1138
00:47:38.075 --> 00:47:41.495
Great. Pat Ward, what gets you excited about the future?

1139
00:47:42.075 --> 00:47:43.335
You know, I, I was thinking

1140
00:47:43.395 --> 00:47:44.935
as I was sitting here listening to these guys

1141
00:47:45.075 --> 00:47:49.055
and, um, yesterday, um, I was, I had a customer meeting

1142
00:47:49.275 --> 00:47:52.095
and, uh, one of the stakeholders said something about how

1143
00:47:52.795 --> 00:47:55.495
she had enjoyed, you know, talking about AI in a way

1144
00:47:55.495 --> 00:47:57.295
that wasn't like what we're talking about today.

1145
00:47:57.355 --> 00:47:59.495
It was actually talking about some very specific things.

1146
00:47:59.515 --> 00:48:01.895
And it is gonna be fun watching this transition

1147
00:48:01.895 --> 00:48:04.255
where we actually get to real world solutions

1148
00:48:04.255 --> 00:48:07.455
and are talking about it, um, like exactly what we did,

1149
00:48:07.575 --> 00:48:08.815
you know, some of the details of

1150
00:48:08.815 --> 00:48:10.375
what Kara's been doing, you know, that kind of stuff.

1151
00:48:10.375 --> 00:48:12.175
And Kimberly, that's a lot of fun.

1152
00:48:12.355 --> 00:48:15.735
Um, and really the fun of the, of the whole thing is, um,

1153
00:48:16.175 --> 00:48:17.375
probably what I'm most excited about.

1154
00:48:17.615 --> 00:48:19.215
I, I like the analysis side.

1155
00:48:19.255 --> 00:48:20.575
I think it's gonna have the biggest bang,

1156
00:48:20.675 --> 00:48:24.015
but, um, it's, it's gonna be, you know, I'm looking forward

1157
00:48:24.015 --> 00:48:27.655
to this moment where the AI is actually improving, uh,

1158
00:48:27.755 --> 00:48:29.855
our assessment, you know, delivery and,

1159
00:48:29.875 --> 00:48:31.015
and how we're taking care

1160
00:48:31.015 --> 00:48:32.295
of people and all that sort of thing.

1161
00:48:32.635 --> 00:48:34.375
Not just cost, not just process,

1162
00:48:34.635 --> 00:48:36.535
but actually where we have a better product

1163
00:48:36.535 --> 00:48:38.375
because of it as a, as an industry, right?

1164
00:48:38.395 --> 00:48:39.575
Not just me, you know?

1165
00:48:39.675 --> 00:48:42.375
Um, and that's gonna be a lot of fun to be a part of. So.

1166
00:48:43.905 --> 00:48:46.255
Great. All right. Kara, final thoughts?

1167
00:48:46.885 --> 00:48:48.255
Yeah, I mean, I agree with everything

1168
00:48:48.255 --> 00:48:49.295
that everyone has said here.

1169
00:48:49.295 --> 00:48:52.695
Certainly my passion lies in really progressing humans,

1170
00:48:52.795 --> 00:48:54.015
and I think we're gonna be able to do that

1171
00:48:54.015 --> 00:48:56.615
with authentic assessment and, and authentic data creation.

1172
00:48:56.675 --> 00:48:59.015
But honestly, what I'm really excited about

1173
00:48:59.015 --> 00:49:01.535
and what, um,

1174
00:49:01.605 --> 00:49:04.095
what I get enthusiastic about at work is seeing the new

1175
00:49:04.295 --> 00:49:05.335
collaborations that I'm seeing.

1176
00:49:05.395 --> 00:49:08.135
Mm-Hmm. We have AI engineers who are working hand in hand

1177
00:49:08.135 --> 00:49:10.055
with data scientists and working hand in hand

1178
00:49:10.055 --> 00:49:12.415
with UX designers and, and researchers.

1179
00:49:12.475 --> 00:49:14.775
And I don't think we saw that in our industry before.

1180
00:49:14.975 --> 00:49:17.255
I think AI engineers were in a technology part

1181
00:49:17.255 --> 00:49:19.335
of the organization, and I think that, you know,

1182
00:49:19.335 --> 00:49:21.135
product was in a different side of the organization.

1183
00:49:21.515 --> 00:49:23.895
Seeing those collaborations is really

1184
00:49:23.955 --> 00:49:26.575
how we're meeting the needs of the learners and educators

1185
00:49:26.575 --> 00:49:28.815
and employers and, and job seekers that we serve,

1186
00:49:29.315 --> 00:49:30.615
and really demonstrates

1187
00:49:30.635 --> 00:49:32.975
how we keep the user at the center throughout the entire

1188
00:49:32.975 --> 00:49:34.215
product development lifecycle

1189
00:49:34.395 --> 00:49:37.055
and make sure that AI is powering those learning science

1190
00:49:37.055 --> 00:49:39.535
principles instead of us driving with the technology.

1191
00:49:39.915 --> 00:49:42.295
So those collaborations, um, are really,

1192
00:49:42.375 --> 00:49:43.535
I think, what are gonna drive the future.

1193
00:49:45.855 --> 00:49:47.525
Great. Alright.

1194
00:49:47.525 --> 00:49:50.445
Well, so thank you again to all of our panelists for all

1195
00:49:50.445 --> 00:49:51.885
of your very, very thoughtful,

1196
00:49:51.885 --> 00:49:53.725
insightful answers to all these questions.

1197
00:49:54.265 --> 00:49:56.165
We do have a good 10 minutes for Q

1198
00:49:56.165 --> 00:49:57.325
and A, which is very exciting.

1199
00:49:58.305 --> 00:50:00.005
Um, the first question

1200
00:50:00.905 --> 00:50:04.805
is asking specifically about something with ETS and NBME.

1201
00:50:04.805 --> 00:50:08.885
So Kimberly and Kara be prepared to answer, um, you know,

1202
00:50:08.915 --> 00:50:10.685
this, this individual would like

1203
00:50:10.685 --> 00:50:12.805
to hear more about the bias piece

1204
00:50:12.905 --> 00:50:16.925
and how you all are using AI to address that bias since all

1205
00:50:16.925 --> 00:50:19.285
of your systems are programmed by humans, by people,

1206
00:50:19.745 --> 00:50:22.045
and how you train it beyond the limitations of people.

1207
00:50:22.825 --> 00:50:24.565
So, ETS is listed first,

1208
00:50:24.585 --> 00:50:26.445
so Kara, I'll put you on the spot first

1209
00:50:26.465 --> 00:50:27.685
and then we'll go to Kimberly.

1210
00:50:28.675 --> 00:50:30.645
Sure. So just to read back the question about

1211
00:50:30.645 --> 00:50:33.245
how is ETS addressing potential bias in some

1212
00:50:33.245 --> 00:50:34.285
of the models that we're training?

1213
00:50:34.305 --> 00:50:35.325
Is that accurate? That's

1214
00:50:35.385 --> 00:50:36.385
Yes. Mm-Hmm. So there

1215
00:50:36.385 --> 00:50:37.765
are a couple of ways that we do this.

1216
00:50:37.905 --> 00:50:41.085
Um, certainly it begins with diversity, ensuring

1217
00:50:41.115 --> 00:50:43.725
that the teams that are building the models

1218
00:50:43.835 --> 00:50:46.645
and annotating the data are diverse

1219
00:50:46.645 --> 00:50:48.605
and representative of the populations that we serve.

1220
00:50:48.945 --> 00:50:52.565
Second, we try to ensure that our data sets are as free

1221
00:50:52.585 --> 00:50:53.645
of bias as possible

1222
00:50:53.645 --> 00:50:55.565
and are as representative as possible of the samples

1223
00:50:55.565 --> 00:50:57.125
that these models are being trained against.

1224
00:50:57.855 --> 00:50:59.445
Third is, of course, human in the loop,

1225
00:50:59.445 --> 00:51:01.285
as we've talked about, to ensure that, um,

1226
00:51:01.545 --> 00:51:03.405
the bias is reduced that way.

1227
00:51:03.825 --> 00:51:05.605
And then continual evaluation

1228
00:51:05.625 --> 00:51:07.285
and monitoring of those models.

1229
00:51:07.795 --> 00:51:11.965
Certainly we, as I said, um, have different standards

1230
00:51:11.985 --> 00:51:13.805
for our models in, um, in

1231
00:51:13.945 --> 00:51:15.765
some products than in other products, our

1232
00:51:15.765 --> 00:51:16.805
more innovative products.

1233
00:51:16.865 --> 00:51:18.885
You know, we are learning and we're testing and learning.

1234
00:51:19.225 --> 00:51:22.325
Um, and so there are ways that we're researching how

1235
00:51:22.325 --> 00:51:24.885
to mitigate bias in those new ways of collecting data, like

1236
00:51:24.885 --> 00:51:25.965
through multimodal AI.

1237
00:51:25.985 --> 00:51:27.805
Mm-Hmm. But then in our high-stakes, standardized assessments,

1238
00:51:27.805 --> 00:51:30.845
we have very rigorous processes for, um, diverse teams,

1239
00:51:30.845 --> 00:51:33.125
diverse training data, uh, a human in the loop

1240
00:51:33.145 --> 00:51:34.485
and continual evaluation.
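
One concrete flavor of the continual evaluation and monitoring Kara mentions is checking automated scores for drift across groups. The sketch below uses a standardized mean difference with made-up data, made-up group labels, and a 0.2 flag threshold; those are assumptions for illustration, and a real fairness review would use far more than one statistic:

```python
# A minimal sketch of one monitoring check: compare mean automated scores
# across groups. Data, group labels, and the 0.2 threshold are illustrative.
from statistics import mean, pstdev

scores_by_group = {
    "group_a": [3.1, 2.8, 3.4, 3.0, 2.9],
    "group_b": [2.6, 2.7, 2.5, 2.9, 2.4],
}

all_scores = [s for scores in scores_by_group.values() for s in scores]
overall_sd = pstdev(all_scores) or 1.0
overall_mean = mean(all_scores)

for group, scores in scores_by_group.items():
    smd = (mean(scores) - overall_mean) / overall_sd
    flag = "REVIEW" if abs(smd) > 0.2 else "ok"
    print(f"{group}: standardized mean difference {smd:+.2f} [{flag}]")
```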

1241
00:51:35.935 --> 00:51:39.595
Great. Kimberly, what have you gotta share?

1242
00:51:39.775 --> 00:51:41.995
So, so in our case, I should, um,

1243
00:51:42.625 --> 00:51:44.115
clarify my language a little bit.

1244
00:51:44.575 --> 00:51:46.995
We weren't interested in looking at bias within individual

1245
00:51:46.995 --> 00:51:49.075
items for the reasons that you mentioned.

1246
00:51:49.075 --> 00:51:51.715
That is something that we train authors very carefully on.

1247
00:51:52.215 --> 00:51:55.275
We were looking to see, once you stacked up an entire item

1248
00:51:55.345 --> 00:51:58.515
bank, whether there are perhaps unexpected patterns in

1249
00:51:58.515 --> 00:52:01.515
how language was used in our case to describe patients.

1250
00:52:02.015 --> 00:52:03.075
So one thing that we found

1251
00:52:03.075 --> 00:52:05.675
that was published in our paper was we have items

1252
00:52:05.685 --> 00:52:08.395
where we describe how much alcohol a patient might be

1253
00:52:08.395 --> 00:52:11.325
consuming. If we indicated the patient was a male,

1254
00:52:11.515 --> 00:52:14.165
what we found was that the alcohol was more likely

1255
00:52:14.265 --> 00:52:15.405
to be described as beer;

1256
00:52:15.745 --> 00:52:16.925
if the patient was female,

1257
00:52:17.505 --> 00:52:20.205
you can probably guess the alcohol was more likely

1258
00:52:20.205 --> 00:52:21.565
to be described as wine.

1259
00:52:22.305 --> 00:52:24.645
Um, is that a problem? We don't know.

1260
00:52:24.865 --> 00:52:26.205
It could be that that is accurate

1261
00:52:26.225 --> 00:52:28.565
and that's how the breakdown

1262
00:52:28.565 --> 00:52:30.925
of patient portrayal should be in our item bank.

1263
00:52:31.305 --> 00:52:34.645
But it also raised very interesting questions about that,

1264
00:52:34.645 --> 00:52:37.565
that we could take back to our content experts

1265
00:52:37.785 --> 00:52:40.045
and say, are we comfortable with this?

1266
00:52:40.305 --> 00:52:42.285
Or by piling up questions

1267
00:52:42.285 --> 00:52:44.365
where it seems like the alcohol is being correlated

1268
00:52:44.365 --> 00:52:45.405
with the gender of the patient.

1269
00:52:46.025 --> 00:52:50.045
Do we risk having a, a, you know, unintended presentation

1270
00:52:50.505 --> 00:52:53.525
of patient vignettes when you look at an entire exam form

1271
00:52:53.545 --> 00:52:54.685
or an entire item bank?

1272
00:52:55.145 --> 00:52:58.725
So in that case, we were, we were putting a little twist on,

1273
00:52:58.865 --> 00:52:59.925
on looking at bias

1274
00:53:00.505 --> 00:53:01.565
and figuring out a way

1275
00:53:01.565 --> 00:53:04.085
that it would've been really challenging for humans to look

1276
00:53:04.085 --> 00:53:07.485
for that and reviewing each individual item one at a time

1277
00:53:07.725 --> 00:53:09.325
wouldn't have necessarily caught that.
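
The kind of bank-level scan Kimberly describes, where a pattern like beer-for-male and wine-for-female patients only shows up in the aggregate, can be sketched as a simple cross-tabulation. The item stems and keyword lists below are hypothetical, and this is not the method from the NBME paper; it only illustrates why the pattern is invisible item by item but obvious across the whole bank:

```python
# A minimal sketch of an item-bank-level pattern scan: cross-tabulate how
# alcohol is described against the patient's stated gender so content
# experts can review any skew. Item stems are hypothetical.
import re
from collections import Counter

item_bank = [
    "A 48-year-old man who drinks six beers daily presents with fatigue.",
    "A 52-year-old woman who drinks two glasses of wine nightly presents with insomnia.",
    "A 45-year-old man who drinks beer on weekends presents with epigastric pain.",
]

counts = Counter()
for stem in item_bank:
    if re.search(r"\b(man|male|boy)\b", stem):
        gender = "male"
    elif re.search(r"\b(woman|female|girl)\b", stem):
        gender = "female"
    else:
        gender = "unspecified"
    for beverage in ("beer", "wine", "liquor"):
        if beverage in stem.lower():
            counts[(gender, beverage)] += 1

for (gender, beverage), n in sorted(counts.items()):
    print(f"{gender:11s} {beverage:6s} {n}")
```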

1278
00:53:10.285 --> 00:53:11.285
Hmm,

1279
00:53:12.685 --> 00:53:13.685
That's fascinating.

1280
00:53:15.185 --> 00:53:17.955
Alright, so moving on to the next question, uh,

1281
00:53:17.975 --> 00:53:19.115
in order of upvotes.

1282
00:53:19.695 --> 00:53:22.155
So looking outside of testing, where is ETS

1283
00:53:22.155 --> 00:53:24.195
and NBME focusing on the use of AI?

1284
00:53:24.615 --> 00:53:26.925
Uh, everyone seems to focus on the content generation,

1285
00:53:26.925 --> 00:53:28.485
but there are many other functions.

1286
00:53:28.515 --> 00:53:31.045
Just curious what else is important to you?

1287
00:53:32.205 --> 00:53:34.985
Um, Kimberly, I'll stick with you, uh,

1288
00:53:35.345 --> 00:53:37.305
just to go in reverse order. Um,

1289
00:53:38.205 --> 00:53:40.585
You know, so I, I I will mention our research agenda.

1290
00:53:41.005 --> 00:53:44.945
Um, not surprisingly, automated scoring is on there,

1291
00:53:44.945 --> 00:53:46.705
which is one reason I'm very excited about it

1292
00:53:46.935 --> 00:53:49.625
because we are considering innovative item types

1293
00:53:49.775 --> 00:53:52.905
that are not, that are, are moving beyond the MCQs.

1294
00:53:53.285 --> 00:53:54.705
And I presented on one type

1295
00:53:54.765 --> 00:53:57.465
or sharp items that have a, a free text component

1296
00:53:57.805 --> 00:54:00.145
and uh, indicate, you know, information

1297
00:54:00.145 --> 00:54:01.185
that gives us some idea

1298
00:54:01.185 --> 00:54:03.265
of your clinical reasoning component.

1299
00:54:03.365 --> 00:54:07.305
But we are also, um, piloting, uh, an app

1300
00:54:07.305 --> 00:54:10.265
to look at communication skills where an examinee is, um,

1301
00:54:10.455 --> 00:54:12.465
viewing a video of a standardized patient

1302
00:54:13.005 --> 00:54:14.585
and then the OSCEs, um,

1303
00:54:14.685 --> 00:54:16.545
the objective structured clinical examinations

1304
00:54:16.855 --> 00:54:19.505
that involve interacting with standardized patients.

1305
00:54:20.165 --> 00:54:23.185
Um, all of those are things where if we rolled it out

1306
00:54:23.185 --> 00:54:25.185
before in some context, we either had

1307
00:54:25.185 --> 00:54:26.465
to make it fit in an MCQ

1308
00:54:26.925 --> 00:54:29.425
or in the case of Step 2 CS, it was a live exam

1309
00:54:29.425 --> 00:54:30.425
with live SPs.

1310
00:54:31.005 --> 00:54:34.145
So I am super excited that we have, you know,

1311
00:54:34.625 --> 00:54:37.385
research in those areas that is all moving rapidly

1312
00:54:37.925 --> 00:54:40.545
and that we've been able to host Kaggle competitions.

1313
00:54:41.035 --> 00:54:42.425
We've been able to make, uh,

1314
00:54:42.425 --> 00:54:44.665
patient note data from our old exams

1315
00:54:44.665 --> 00:54:46.545
available for others to use.

1316
00:54:46.605 --> 00:54:48.345
So to get back to, I believe it was

1317
00:54:48.345 --> 00:54:50.745
what Kara said about suddenly everybody's working together

1318
00:54:51.125 --> 00:54:53.665
and you have a lot more cooks in the kitchen, um, you know,

1319
00:54:53.665 --> 00:54:55.185
that's been exciting as well for us.

1320
00:54:56.625 --> 00:54:59.815
Great. I'll just quickly add for the purposes of time, one

1321
00:54:59.815 --> 00:55:01.575
of the big areas that we're thinking outside

1322
00:55:01.575 --> 00:55:03.015
of testing is really on, um,

1323
00:55:03.015 --> 00:55:05.495
supporting individuals on their non-linear path

1324
00:55:05.495 --> 00:55:06.895
between education and the workforce.

1325
00:55:07.155 --> 00:55:08.255
Moving between education

1326
00:55:08.255 --> 00:55:09.655
and the workforce is incredibly important.

1327
00:55:09.715 --> 00:55:11.815
And so we're developing tools that, um,

1328
00:55:11.815 --> 00:55:14.575
help individuals understand skills they had, they didn't

1329
00:55:14.575 --> 00:55:17.255
otherwise know they had, and connect them to opportunities.

1330
00:55:17.515 --> 00:55:18.775
So it kind of goes with

1331
00:55:18.775 --> 00:55:20.015
what we've all been talking about here.

1332
00:55:20.235 --> 00:55:21.975
If you're interested, you can go to the app store

1333
00:55:21.995 --> 00:55:23.855
and download an app called Authentic.

1334
00:55:24.115 --> 00:55:25.775
The Authentic app, um,

1335
00:55:25.915 --> 00:55:27.935
is an interview prep tool, for example.

1336
00:55:28.045 --> 00:55:31.055
Basically it helps an individual who might be interviewing

1337
00:55:31.055 --> 00:55:33.255
for a data science role at a senior level next week.

1338
00:55:33.255 --> 00:55:34.535
They can go in the app, they can say

1339
00:55:34.535 --> 00:55:36.895
what role they're applying for, what level it's at,

1340
00:55:37.045 --> 00:55:38.895
what are the interview questions, they wanna, um,

1341
00:55:39.135 --> 00:55:40.655
practice, practice those questions.

1342
00:55:40.715 --> 00:55:43.815
The multimodal AI behind it gives them feedback on their effective

1343
00:55:44.015 --> 00:55:45.575
communication, et cetera, and helps

1344
00:55:45.735 --> 00:55:46.855
'em prepare for those interviews.

1345
00:55:47.115 --> 00:55:48.935
So it's really about, um, you know,

1346
00:55:48.935 --> 00:55:50.175
is it really about human progress?

1347
00:55:50.175 --> 00:55:51.535
It's powering human progress

1348
00:55:51.555 --> 00:55:53.375
and we're thinking beyond assessment to enable that.

1349
00:55:55.745 --> 00:55:56.745
Love that. Authentic. Will

1350
00:55:56.745 --> 00:55:58.595
you spell that out for us, right?

1351
00:55:59.025 --> 00:56:02.435
Authentic. A-U-T-H-E-N-T-I-C interview.

1352
00:56:02.505 --> 00:56:04.755
Okay. The way it's actually spelled. Okay. Yes. Perfect.

1353
00:56:05.375 --> 00:56:08.115
You know, sometimes organizations throw Qs in there

1354
00:56:08.295 --> 00:56:09.995
and just Okay, good to know.

1355
00:56:10.075 --> 00:56:13.475
I know. Alright, this is an interesting question about how

1356
00:56:13.475 --> 00:56:16.235
to get your team started using AI in general, just

1357
00:56:16.235 --> 00:56:19.115
because some folks aren't interested even

1358
00:56:19.115 --> 00:56:20.315
with knowing the benefits.

1359
00:56:20.615 --> 00:56:22.755
So, um, how to generate interest.

1360
00:56:22.815 --> 00:56:24.125
So Pat, I'll start with you

1361
00:56:24.125 --> 00:56:26.005
and then I'll, I'll see what Andre has to say as well.

1362
00:56:26.165 --> 00:56:28.045
I was gonna say Andre probably is better at this

1363
00:56:28.325 --> 00:56:31.205
'cause he, his business actually is involved with some

1364
00:56:31.205 --> 00:56:34.445
of this, but, uh, but yeah, I mean, for me, um, you,

1365
00:56:34.465 --> 00:56:36.005
you know, first of all, you can't force it.

1366
00:56:36.105 --> 00:56:39.885
You, um, I made the tools available, um, I encourage them

1367
00:56:39.885 --> 00:56:44.605
to do it and, um, um, and then, and then we made it.

1368
00:56:44.665 --> 00:56:47.005
So we gave 'em a lot of freedom, um, and,

1369
00:56:47.065 --> 00:56:48.925
and the word of mouth actually took over.

1370
00:56:49.385 --> 00:56:52.805
Um, and it, it was much more, um, it was,

1371
00:56:52.865 --> 00:56:55.325
it really turned out to be incredibly useful the way

1372
00:56:55.325 --> 00:56:58.285
everybody sort of started talking and sharing tools and,

1373
00:56:58.625 --> 00:57:00.765
and ideas and how to improve things.

1374
00:57:00.825 --> 00:57:02.445
And, uh, it's, it's been very cool.

1375
00:57:02.825 --> 00:57:05.555
So I'll let Andre wrap this one up though.

1376
00:57:05.895 --> 00:57:07.915
Yes. Yeah, I, I was gonna say, if,

1377
00:57:07.915 --> 00:57:11.155
if the environment is not that of a little bit of creativity

1378
00:57:11.155 --> 00:57:13.115
and entrepreneurship, you'll be hard pressed

1379
00:57:13.115 --> 00:57:14.755
with this concept to change that culture.

1380
00:57:15.375 --> 00:57:17.075
Um, but I'm gonna put, uh, me

1381
00:57:17.075 --> 00:57:18.115
and Jack are very good friends.

1382
00:57:18.305 --> 00:57:19.875
I'll put us on, on the spot here.

1383
00:57:19.875 --> 00:57:22.675
The way it got introduced in our company is that, um,

1384
00:57:22.975 --> 00:57:24.715
you know, I went out to ChatGPT

1385
00:57:24.715 --> 00:57:27.275
and I said, generate me a 20 item test to, to measure this.

1386
00:57:27.275 --> 00:57:30.235
And I sent that to him and I think it frustrated him so bad.

1387
00:57:30.235 --> 00:57:31.875
He says, I gotta figure out this thing, right?

1388
00:57:32.455 --> 00:57:34.155
And then he says, you know, they usually come back

1389
00:57:34.155 --> 00:57:35.235
with the scientific answer

1390
00:57:35.415 --> 00:57:37.035
and then I just re-asked the question back

1391
00:57:37.095 --> 00:57:40.195
and it sort of rejiggered it, so I sent it to him again.

1392
00:57:40.195 --> 00:57:42.755
And it was, and so I think what happened is it opened his,

1393
00:57:42.815 --> 00:57:44.875
his eyes to how powerful it is.

1394
00:57:45.015 --> 00:57:46.635
So, um, a little bit of challenge.

1395
00:57:46.755 --> 00:57:48.715
A little bit of tension is not a bad thing,

1396
00:57:48.735 --> 00:57:50.635
but, uh, I wouldn't force it. Mm-Hmm.

1397
00:57:51.195 --> 00:57:52.915
I would also recommend a Hackathon if you

1398
00:57:52.915 --> 00:57:54.155
haven't run a Hackathon in your

1399
00:57:54.155 --> 00:57:55.155
organization. Those

1400
00:57:55.155 --> 00:57:57.675
are fun. Um, get a lot of people involved.

1401
00:57:58.585 --> 00:58:02.035
Yeah. I also think it might've been you, Kimberly,

1402
00:58:02.035 --> 00:58:03.635
who mentioned earlier, finding tasks

1403
00:58:03.665 --> 00:58:06.075
that the humans hate doing and starting there as well,

1404
00:58:06.105 --> 00:58:08.435
because I think that can be a great incentive. Yeah.

1405
00:58:08.435 --> 00:58:11.035
I mean, it, it's what you want to, um,

1406
00:58:12.585 --> 00:58:13.995
advertise about it, which is that

1407
00:58:13.995 --> 00:58:15.515
it's not intended to replace anyone.

1408
00:58:15.585 --> 00:58:18.155
It's not intended to radically revamp

1409
00:58:18.155 --> 00:58:19.195
everything right off the bat.

1410
00:58:19.225 --> 00:58:21.235
It's intended to help humans do things better

1411
00:58:21.345 --> 00:58:23.555
that humans think are important to do better.

1412
00:58:23.865 --> 00:58:25.995
Therefore, if you emphasize that part of it.

1413
00:58:26.215 --> 00:58:27.715
But I also would say you do need

1414
00:58:27.955 --> 00:58:29.235
advocates within your organization.

1415
00:58:29.415 --> 00:58:30.555
So if you see a spark

1416
00:58:30.935 --> 00:58:32.995
or someone's like, Hey, I think this might be interesting,

1417
00:58:32.995 --> 00:58:35.355
but I don't know that much about it yet, nurture that.

1418
00:58:35.385 --> 00:58:38.275
Yeah. Foster that. Get them on board, because, you know,

1419
00:58:38.275 --> 00:58:40.635
otherwise you could risk sounding like a broken record.

1420
00:58:40.975 --> 00:58:42.275
You definitely need advocates

1421
00:58:42.275 --> 00:58:44.395
and other people on board to talk about the potentials.

1422
00:58:44.695 --> 00:58:48.365
Mm-Hmm. Wonderful. All right.

1423
00:58:48.425 --> 00:58:50.165
Uh, we are at 12:59, so,

1424
00:58:50.265 --> 00:58:51.925
and there are still four questions to go.

1425
00:58:52.085 --> 00:58:53.805
So I don't think we're gonna have time to get

1426
00:58:54.025 --> 00:58:55.445
to all of the questions today,

1427
00:58:55.445 --> 00:58:57.525
unfortunately. I wanna respect all of your time.

1428
00:58:58.145 --> 00:59:00.245
Um, but I, I do want

1429
00:59:00.245 --> 00:59:02.285
to once again thank our amazing panelists

1430
00:59:02.285 --> 00:59:03.285
for joining us here today.

1431
00:59:03.705 --> 00:59:05.205
Pat Ward, Andre Allen,

1432
00:59:05.315 --> 00:59:06.925
Kara McWilliams, and Kimberly Swygert.

1433
00:59:06.925 --> 00:59:08.325
Thank you so much for joining us

1434
00:59:09.025 --> 00:59:11.365
and uh, I'll share my screen one more time so

1435
00:59:11.365 --> 00:59:13.005
that you all know, um,

1436
00:59:13.225 --> 00:59:15.725
how you can reach our lovely panelists

1437
00:59:15.725 --> 00:59:17.245
after the webinar if you would like

1438
00:59:17.245 --> 00:59:18.685
to continue the conversation.

1439
00:59:19.305 --> 00:59:21.365
So thank you so much again,

1440
00:59:21.505 --> 00:59:23.165
and once again, the recording will be available.

1441
00:59:23.215 --> 00:59:24.365
Don't click away too quickly.

1442
00:59:24.385 --> 00:59:27.205
You'll have access to a survey to tell us your thoughts.

1443
00:59:28.085 --> 00:59:29.545
Thanks, Sandy. Thank you. Thank

1444
00:59:29.545 --> 00:59:30.545
you, Sandy. Thank you so much, Sandy,

1445
00:59:30.545 --> 00:59:31.265
for all your

1446
00:59:31.265 --> 00:59:32.265
efforts. Thank you so

1447
00:59:32.265 --> 00:59:34.705
much. Alright, good job. Take care everybody.

1448
00:59:35.245 --> 00:59:37.305
Thanks. See ya. Bye everybody.