
Trusted Identities in the Age of Zero Trust

Find out how to eliminate 96% of identity breaches without tearing down and replacing your existing systems. Wade Ellery and John Ross Petrutiu from Radiant Logic show you how to secure access, reduce identity-related risks, and automate IAM controls. Concrete solutions, Zero Trust security, and proven results you can count on.

Read the transcript

Wade: Well, I think we can probably kick off. Okay. And are we recording? Make sure that's happening.

JR: Looks like we're good to go, Wade. Yeah. It looks like we're recording.

Wade: All right. Well, thank you everyone for joining us today. This is Radiant Logic's webinar on Trusted Identities in the Age of Zero Trust. What's interesting to me, or actually a very positive experience for me, is that I think we are now finally in the age of Zero Trust. I saw it coming for quite a while, and I'm very excited to see it getting tremendous traction in the business world. So today we're going to talk about the criticality of identity when you start talking about Zero Trust, or really anything you do in the identity management space, where identity becomes the foundation and the linchpin of your complete security platform. I'm joined today by JR, who is one of our senior... can I go to the next slide?

JR: Absolutely. Yeah.

Wade: He's our Senior Solutions Consultant with Radiant Logic. My name is Wade Ellery; I'm the Field CTO at Radiant Logic. We'll be presenting insights today into the impact of identity on the security model and how to build out your platform to increase overall security, with identity as the linchpin of that model.

Wade: What we're realizing now is that we've seen an evolution in identity management over the decades. We started out with single sign-on, went to provisioning, and then to governance, to solve problems within the identity infrastructure and make the business more effective. But what's happening now, and we see it in the news every day, is that breaches are the number one threat within organizations today. It's no longer business efficiency. It's no longer access to applications. It's the threat that an outsider is going to come in and either ransomware your environment and lock it up so you can't use it, steal your proprietary information, or steal your identity data. This is the number one challenge now. Ninety percent of the organizations out there right now will be or have been breached. And these breaches can currently be traced back to the compromise of an account: a user identity, a service account, a local account, some machine account or non-human entity. These are the entryways into the network. We used to have to find our way through a firewall; as someone said recently, an attacker doesn't have to get on your network anymore, they just have to log in, because everything is so distributed now that compromising an account is the way into the system. So how do you counter something like that? How do you deal with the idea that a compromise in your environment is so easy to pull off? JR, how would you approach that?

JR: Absolutely, that's a good question. I think, just to your point, Wade, what's happened here is an evolution of the security perimeter. The traditional security perimeter was about network boundaries: the idea that you have a clear distinction between inside and outside the org. Over the last few years, we've seen an evolution away from that model to, like you said, a distributed model where resources exist in many different locations, both physically and in different cloud repositories, and where users themselves aren't always working from within a physical building; there's a lot of work-from-home, work-from-anywhere. So what we see is a change in the threat landscape, away from the traditional perimeter and toward identity as the perimeter itself. The idea is that if a user account, a service account, or an admin account is compromised, that leads to the ability to compromise additional services across the organization. And the real shift is that threat actors have also realized it's actually much easier to compromise an account than it is to compromise software. For example, if you have a lot of different tools in place, like single sign-on, MFA, and IGA, to secure access to accounts, it then becomes a question of: is it easier to find a flaw in a piece of software, or is it easier to social-engineer your way into an account that has legitimate access rights? By impersonating a user, you gain access to critical resources within the company, rather than having to find a flaw within the software. It's really about what's simplest for the threat actor: the easiest, quickest approach with the least amount of effort. And generally speaking nowadays, that's compromising the identity.

Wade: It sounds like that adage: you don't have to run faster than the bear, you just have to run faster than your buddy next to you when the bear is chasing you. So you want to look and see what your weakest point is in your network. And over time, it sounds like that weakest point has evolved to be the account itself.

JR: Correct.

Wade: So we're hearing about this not just from our own experiences, as you and I have had quite a few in this area, but from across the industry: from large customers, from analysts, from governing bodies. They're all coming back and validating the same concern, that identity is now the only place left, the critical piece you have to secure in order to secure your environment. And unfortunately, over a long time we have created a tremendous amount of identity debt, a lot of it centered around identity itself, and that has caused a tremendous increase in the vulnerability of identity. Especially when you talk about service accounts, as you did a minute ago: service accounts tend to have a higher level of access but a more anonymous kind of operation within the network, and they are much less likely to be governed and managed, so they become very susceptible to compromise. And as mentioned in a couple of the quotes here from leaders in the industry, the advent of the cloud and the distribution of our identities outside our organization into other trusted organizations creates a real challenge: being able to secure myself within a third party. How do I even make sure that's as secure as it can be?

JR: Yep. Just to add to that, because of the evolving landscape, what we see increasingly is that traditional measures fall short, like we were talking about with the traditional network perimeter, or even with traditional tools nowadays. Look at the classic IGA approach. In essence, IGA is glorified automation to a large degree: you're automating the onboarding and offboarding tasks. There is some level of analytics included in most traditional IGA platforms and approaches, but what you often end up with is deferred change: it may take a long time for changes to propagate across the org. Additionally, you end up with a somewhat static authorization model, where through your governance engine you're building things like static groups and roles. While those may be dynamically populated based on a set of conditions, they are, a lot of the time, necessarily making a decision before the time of access. And here we're talking about role-based or group-based access control.
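The distinction JR draws between entitlements decided ahead of time and entitlements evaluated at the moment of access can be sketched in a few lines. This is an illustrative model only; the group name, attribute keys, and rule below are hypothetical and not any vendor's implementation.

```python
# Static model: membership was computed at provisioning time, possibly days ago.
STATIC_GROUP = {"alice", "bob"}  # hypothetical pre-built "finance-app-users" group

def static_decision(user: str) -> bool:
    # Decision is whatever the group contained when governance last ran.
    return user in STATIC_GROUP

# Dynamic model: the same rule, re-evaluated with fresh attributes at access time.
def dynamic_decision(attrs: dict) -> bool:
    # attrs would be fetched from the identity store at the moment of the request
    return (
        attrs.get("department") == "finance"
        and attrs.get("employment_status") == "active"
        and not attrs.get("flagged_compromised", False)
    )

# Bob was terminated this morning; the static group has not been refreshed yet.
bob_now = {"department": "finance", "employment_status": "terminated"}
print(static_decision("bob"))     # True  - stale entitlement still grants access
print(dynamic_decision(bob_now))  # False - fresh attributes deny it
```

The gap between the two answers is exactly the window a compromised or stale account can exploit under a pre-computed model.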

Wade: So it sounds like we have a lot of inherent challenges just in the way things have been engineered and have evolved over time. Let's take a look and see what we can do to try and heal this wound. Clearly, what happens when identity fails? Well, this is the classic compromise of a network. In fact, I do a lot of commentary for our organization on events. There's breaking news of a breach somewhere, at NIH or someplace else, and they look for commentary from industry professionals to highlight the causes, the effects, or the continuous nature of some of these. I'm now getting a request from our marketing team on a daily basis for comment on a breach. It is becoming ubiquitous. What was interesting to me was a quote I read recently: two years ago, it used to take about a month for an attacker to get into your network, compromise enough resources to gain some level of control, and either take access to vital information and identity data and put it on the dark web, or lock up the platform with a ransomware model. But now, with the advent of AI and the democratization of attacks, with platforms for rent on the dark web to carry these attacks out, you're down to less than thirty minutes to compromise an account and gain control of an environment. So if you've got a platform in place that's doing your audit, review, and access verification on a twenty-four-hour cycle, because it takes that long to load the data every night, you're going to miss something. I come in at eight in the morning, work for a few hours in your environment, grab everything I need, hide myself, clear the logs, and I'm gone by noon. You never saw me. And I can come back and do that over and over again, because I'm undetected by those systems. So really, JR, what do we do in a situation like that, when our existing eyes on glass can't see the actual negative events taking place in the background?

JR: That's a good question. I would actually extend it a little and say: what do we do in a situation where the tools in place do detect a threat but aren't able to take action quickly enough? That's really the scenario we're illustrating on this slide. This is an actual use case we saw, before deploying, with a customer of ours. They had a breach, and their SOC did detect that an event was happening. It took them under thirty minutes to cut access for the compromised accounts. But the real issue was that within those thirty minutes, the bad actor was able to exfiltrate multiple terabytes of data onto their own machines, and then was able to sell some of that data and leak the rest of it. But back to your question, Wade: what can you do about this? It highlights the need for a more preventative approach to security, because just-in-time, real-time response is simply not fast enough. What this highlights is the need to detect, ahead of an attack, any vulnerabilities or issues with accounts, over-allocated access, things like that, and to rectify those issues so that even when an account does get compromised, because inevitably it will happen, you have minimized ahead of time the impact that compromise or breach will have. So it's really about focusing on a more preventative approach, rather than just a response to an event as it's happening.
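One pre-breach control of the kind JR describes is flagging over-allocated access before anything is compromised, for example by comparing each account's entitlement count against its peer group. A minimal sketch with hypothetical account data and an arbitrary threshold:

```python
from statistics import median

# Hypothetical inventory: account -> (peer group, set of entitlements)
accounts = {
    "alice":   ("finance", {"gl-read", "gl-write", "ap-read"}),
    "bob":     ("finance", {"gl-read", "ap-read"}),
    "carol":   ("finance", {"gl-read", "gl-write", "ap-read", "ap-write",
                            "treasury-admin", "db-admin"}),
    "svc-etl": ("services", {"db-read"}),
}

def over_allocated(accounts, factor=1.5):
    """Flag accounts holding more than `factor` x the median entitlement
    count of their peer group - candidates for pre-breach cleanup."""
    by_group = {}
    for name, (group, ents) in accounts.items():
        by_group.setdefault(group, []).append((name, len(ents)))
    flagged = []
    for group, members in by_group.items():
        med = median(count for _, count in members)
        for name, count in members:
            if med and count > factor * med:
                flagged.append(name)
    return flagged

print(over_allocated(accounts))  # ['carol']
```

Trimming the outliers this surfaces shrinks the blast radius an attacker inherits when one of those accounts is eventually compromised.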

Wade: So even though you've got excellent measures to counter a burglar once he's in your house, you'd really rather live in a world where you prevent the burglar from getting in. Having to react and fight them off is the last resort. If you can do something that actually prevents that, and cuts down on that person's ability to compromise your network, then you're ahead of the game: you're no longer reactive, you're being preventative. And that seems to be a great direction to take this conversation. But is that something we can do? Because it sounds like identity right now is highly compromised. If this problem were easy to solve, it wouldn't be as ubiquitous as it is today, where three out of four organizations are compromised and eighty-six percent of those compromises come through some kind of overprivileged account. So what is the step forward, or what is the method we want to look at now to try and harden our identities? What can we do? Are we ripping and replacing everything we have, or is there another way to make this a more successful endeavor?

JR: That's a good question. To begin to answer it, we need to look at the reason why this is possible, why these types of attacks happen. It's about taking a step back and reassessing the current state within organizations: figuring out, first of all, what the attack surface is in terms of identities across the org; doing some initial cleanup as well, with analysis of the existing identities and accesses to very quickly pinpoint ghost accounts, over-privileged accesses, orphan accounts, illegitimate accounts, things like that. And as you're doing that, it's about trying to establish a risk level across the org: looking at all of the accesses that are there and associating risk, so that other applications downstream can also leverage those risk scores. For example, you might have accounts with legitimate access to a highly risky application that handles financial transactions. A user might require that access, but if you're labeling it properly, other applications, in the SOC for instance, can look at the behavior there and lock things down more quickly. So if you do detect a breach at that point, if you're doing responsive work, you can lock it down quicker. But also, before that breach happens, you can reduce the level of access that accounts have which they don't necessarily require. And so the steps to move toward this are, first of all, consistent data management processes across the entirety of the organization. That means making sure that identities, and the data around them, are aggregated into one place and maintained properly, and that everyone has the right level of access. Then preventative controls on top of that, which help us quickly detect and remediate any deviations from basic security principles like least-privilege access. And then finally, using this aggregated data and these controls, performing in-depth assessments of accesses and security within the organization in a continuous way, by continuously ingesting and observing any changes to accesses: new accounts created, new permissions allocated, things like that.
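The cleanup JR describes — aggregate accounts, pinpoint orphans and ghosts, and attach a risk label that downstream tools such as a SOC can consume — can be sketched roughly as below. The data model, field names, and risk rule are hypothetical and purely illustrative.

```python
# Hypothetical aggregated catalog after pulling from AD, cloud IdPs, app stores.
people = {"alice", "bob"}  # current HR roster
accounts = [
    {"id": "alice@corp", "owner": "alice", "entitlements": {"crm-read"}},
    {"id": "jdoe@corp",  "owner": "jdoe",  "entitlements": {"gl-write"}},  # owner left
    {"id": "svc-backup", "owner": None,    "entitlements": {"db-admin"}},  # no owner
]
HIGH_RISK_ENTITLEMENTS = {"gl-write", "db-admin", "treasury-admin"}

def triage(accounts, people):
    """Label each account so downstream tools (SOC, IGA) can prioritize."""
    report = []
    for acct in accounts:
        issues = []
        if acct["owner"] is None:
            issues.append("orphan")   # no owning identity at all
        elif acct["owner"] not in people:
            issues.append("ghost")    # owner no longer in the HR roster
        risk = "high" if acct["entitlements"] & HIGH_RISK_ENTITLEMENTS else "low"
        report.append({"id": acct["id"], "issues": issues, "risk": risk})
    return report

for row in triage(accounts, people):
    print(row)
```

Flagged rows become the work queue for the initial get-clean pass: deactivate, reassign to an owner, or strip entitlements.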

JR: I'll move on to the next slide, and we'll start to dig into how you can do this in a concrete way. But first, let's talk about the pillars of Zero Trust. Do you want to talk to this one, Wade?

Wade: Yeah. And bridging off what you just said a moment ago, it sounds like observability, visibility into your identity data, is a critical first step. You can't manage what you can't see, and blind spots are what the bad actors are exploiting: the areas you haven't locked down and cleaned up. Because of the amount of IT debt we have, there's a lot of work to do there. And it really goes back down to the identities at the attribute level: what provisions do they have, what have they been provisioned to, what attributes do they have. Because when you start doing Zero Trust, working the other way, from right to left, you start with network access: whether I can even get on the network. Am I on a secure device? Am I coming from a secure location? Am I a trusted user? Then, where can I go within the network, in terms of network segmentation, controlling flow and security there? Data access, at the actual row and column level inside databases, is now policy-controlled in a Zero Trust model. Then application access and application operation: east-west operations by applications within the environment themselves, service accounts and those models, devices, and then all the way up to application access itself moving toward the Zero Trust model. Every one of those policy decision points, at each of those areas, that says "yes, this user can access this resource" is evaluating identity data, attributes about that user, against the policy. If the policy says it's okay, he gains access. If the policy says it's a violation, he doesn't. It's the identity data that makes those decisions. And today, can you trust your identity data one hundred percent to be accurate, to be complete, to be uncorrupted by a bad operator, to be something you trust to authorize all the access from the edge of your network all the way to your applications? That's where I think we have to focus our attention. And that really highlights even more why identity becomes more and more critical here.

JR: Absolutely.
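The policy decision point Wade describes, evaluating identity attributes against a policy at each access request, can be pictured as a small attribute-based check with default-deny semantics. The resources, attribute names, and rules below are hypothetical.

```python
# Hypothetical policy: each rule maps a resource to the attribute
# conditions an identity must satisfy at the moment of the request.
POLICY = {
    "payroll-db": lambda a: a["department"] == "finance"
                            and a["device_trusted"]
                            and a["mfa_verified"],
    "wiki":       lambda a: a["employment_status"] == "active",
}

def decide(resource: str, attributes: dict) -> str:
    """Policy decision point: permit only if the rule for this resource
    passes on the identity attributes presented with this request."""
    rule = POLICY.get(resource)
    if rule is None or not rule(attributes):
        return "deny"    # default-deny: unknown resource, or a violation
    return "permit"

alice = {"department": "finance", "employment_status": "active",
         "device_trusted": True, "mfa_verified": True}
print(decide("payroll-db", alice))                             # permit
print(decide("payroll-db", {**alice, "mfa_verified": False}))  # deny
```

Note that the quality of the decision is only as good as the attribute values fed into it, which is exactly Wade's point about trusting the identity data.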

JR: To move on to the next slide. So we've established some of the challenges today around identity-first security, especially around Zero Trust, where identity shifts to being the foundation for all of the other decisions made within the organization, from a data perspective and from a security perspective. At its core, the identity data itself is what needs to be cleaned up. Accesses need to be restricted to the minimum necessary for users to accomplish their jobs, and there needs to be a proactive approach to making sure the data stays in a good state over time. The way we approach this with our customers, the way we've done it before, is through an approach we call "get clean, stay clean," and then from there you use that clean data. For the get-clean piece, as we were saying a couple of slides ago, the concept is to first gain full visibility, full insight into the data that exists across the organization. This is basically creating a full inventory of user accounts, non-human accounts, and the entitlements those accounts can access. From there, we focus on staying clean. Once we create this full catalog, the goal is to apply controls and fix any issues with accesses: remove extraneous access and over-allocated privileges, and identify and deactivate things like orphan accounts, or attach them to a user if they still need to be there. Really reduce the scope of accounts first, and then the accesses they have. To stay clean, the idea is to continuously detect and ingest any changes that occur to those identities: like we were saying earlier, newly created identities, newly created permissions allocated to users, things like that.

406
00:19:19,560 –> 00:19:22,040
Then identify any
quality issues.

407
00:19:22,040 –> 00:19:25,720
So that would be identifying
the permissions that were added

408
00:19:25,720 –> 00:19:27,800
but shouldn’t be there,
things like that.

409
00:19:27,800 –> 00:19:30,680
And then finally take the
results of the identification

410
00:19:30,680 –> 00:19:33,225
in order to
automatically remediate.
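The detect-evaluate-remediate loop described here can be sketched in a few lines. This is only an illustration under assumed names (the baseline, risk catalog, and change handler are invented for this sketch): each detected grant is compared against the last-known-clean state, then auto-revoked or queued for a manager to verify.

```python
# Minimal sketch (all names hypothetical): ingest identity changes, compare
# them against a last-known-clean baseline, and either auto-revoke or queue
# the change for human review.

baseline = {  # last-known-clean entitlements per user
    "jdoe": {"vpn", "crm-read"},
}
high_risk = {"domain-admin", "prod-db-write"}  # assumed risk catalog

review_queue = []
revoked = []

def on_change(user, added):
    """Handle a newly detected entitlement grant."""
    current = baseline.get(user, set())
    if added in current:
        return  # already part of the clean state, nothing to do
    if added in high_risk:
        # High-risk grants outside the baseline are revoked immediately.
        revoked.append((user, added))
    else:
        # Everything else goes to a line or application manager to verify.
        review_queue.append((user, added))

on_change("jdoe", "vpn")            # no-op: already in the clean baseline
on_change("jdoe", "prod-db-write")  # auto-revoked
on_change("jdoe", "wiki-edit")      # queued for review
```

The key design point is the split: automatic remediation for clearly risky out-of-baseline grants, human verification for the rest.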

411
00:19:33,225 –> 00:19:37,065
So either revoke accesses
or validate them and

412
00:19:37,065 –> 00:19:41,065
verify them with
either line managers,

413
00:19:41,065 –> 00:19:44,945
or application managers who know whether
that access should in fact be there.

414
00:19:45,065 –> 00:19:47,785
And then from there it’s
about enforcing regular access

415
00:19:47,785 –> 00:19:52,200
reviews to validate
the legitimacy of those

416
00:19:52,200 –> 00:19:53,960
permissions that
have been added.

417
00:19:53,960 –> 00:19:55,720
This is, you know,

418
00:19:55,720 –> 00:19:59,160
from kind of the security into
the compliance piece almost

419
00:19:59,160 –> 00:20:02,600
where you want to make
sure and verify consciously

420
00:20:02,600 –> 00:20:05,335
with your managers that the

421
00:20:05,335 –> 00:20:09,695
accesses that are there are indeed
good and shouldn’t be revoked.

422
00:20:09,815 –> 00:20:10,855
And then finally,

423
00:20:10,855 –> 00:20:14,215
allowing applications to
leverage that data in order to

424
00:20:14,215 –> 00:20:16,215
make just in time
access decisions.

425
00:20:16,215 –> 00:20:19,630
And this is what moves you
towards kind of the Zero Trust

426
00:20:19,630 –> 00:20:20,990
security model.

427
00:20:20,990 –> 00:20:24,190
The idea being to have
this clean, accurate data,

428
00:20:24,190 –> 00:20:26,670
having a mechanism through
which you can guarantee that it

429
00:20:26,670 –> 00:20:28,270
stays clean and accurate,

430
00:20:28,270 –> 00:20:30,910
and then being able to
present it to your downstream

431
00:20:30,910 –> 00:20:34,245
applications so that they can
make policy based decisions in

432
00:20:34,245 –> 00:20:37,885
real time as effectively
and as quickly as possible.
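A real-time, policy-based decision of the kind JR describes can be sketched as a tiny policy decision point. The attributes, policy, and default-deny rule below are illustrative assumptions, not a specific product's API: the point is that every decision reads from the same clean identity record.

```python
# Minimal policy-decision-point sketch (hypothetical attributes and policy):
# authorize a request in real time from one set of clean identity data.

clean_identity = {
    "jdoe": {"department": "finance", "employment_status": "active",
             "entitlements": {"file-share-finance"}},
}

def decide(user, resource):
    attrs = clean_identity.get(user)
    if attrs is None or attrs["employment_status"] != "active":
        return "deny"  # unknown or inactive identities never get access
    if resource == "file-share-finance":
        # Policy: finance shares require both department and entitlement.
        ok = (attrs["department"] == "finance"
              and resource in attrs["entitlements"])
        return "permit" if ok else "deny"
    return "deny"  # default-deny, in the Zero Trust spirit

print(decide("jdoe", "file-share-finance"))   # permit
print(decide("ghost", "file-share-finance"))  # deny
```

Because every application calls the same decision function over the same record, there is one place to audit and one place to trust.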

433
00:20:38,725 –> 00:20:39,205
Excellent.

434
00:20:39,205 –> 00:20:41,845
And it seems to me there are
three major areas here that are

435
00:20:41,845 –> 00:20:44,970
functionally necessary
for this to operate.

436
00:20:44,970 –> 00:20:46,890
One is the introduction of AI,

437
00:20:46,890 –> 00:20:48,490
because the idea
of getting clean,

438
00:20:48,490 –> 00:20:52,090
the sheer scale of the number
of identities multiplied by the

439
00:20:52,090 –> 00:20:54,730
number of attributes across
the different platforms and

440
00:20:54,730 –> 00:20:57,250
different formats and
different identifiers,

441
00:20:57,295 –> 00:21:00,335
that is more than the human
mind can wrap itself around.

442
00:21:00,335 –> 00:21:03,055
If you’re a large organization
with thousands or tens of

443
00:21:03,055 –> 00:21:05,775
thousands or hundreds of
thousands of employees,

444
00:21:05,775 –> 00:21:08,255
you have a massive amount
of data to go through.

445
00:21:08,255 –> 00:21:10,815
AI is built for exactly
that kind of task,

446
00:21:10,815 –> 00:21:12,640
where I’m looking at
massive amounts of data,

447
00:21:12,640 –> 00:21:14,720
and I’m trying to
find commonality.

448
00:21:14,720 –> 00:21:16,160
I’m trying to find anomalies.

449
00:21:16,160 –> 00:21:18,240
I’m looking at the data
for patterns and pattern

450
00:21:18,240 –> 00:21:20,960
recognition, and I
learn as I go along.

451
00:21:20,960 –> 00:21:24,720
So implementing AI in this
platform has a number of places

452
00:21:24,720 –> 00:21:26,675
where it adds tremendous value.

453
00:21:26,675 –> 00:21:31,315
The access review, the Stay Clean
capability is also AI driven,

454
00:21:31,315 –> 00:21:34,595
where I can build context
around my review process so

455
00:21:34,595 –> 00:21:38,115
that the human reviewer has
more information to make a more

456
00:21:38,115 –> 00:21:39,635
valid decision.

457
00:21:39,635 –> 00:21:42,340
Because the worst thing you can
do is clean up your environment

458
00:21:42,340 –> 00:21:45,860
and then go through some rubber
stamping exercises to verify

459
00:21:45,860 –> 00:21:49,620
that it’s staying clean as it
slowly gets dirtier and dirtier

460
00:21:49,620 –> 00:21:52,180
and entropy creeps
into the system.

461
00:21:52,180 –> 00:21:55,500
So you want to make sure
you’re managing it end to end.

462
00:21:55,605 –> 00:21:58,725
And then the ability to
recognize change in real time,

463
00:21:58,725 –> 00:22:00,085
because as you
mentioned earlier,

464
00:22:00,085 –> 00:22:03,365
that real time nature is
critical to be able to

465
00:22:03,365 –> 00:22:07,365
recognize an attempt to change
information in a way that

466
00:22:07,365 –> 00:22:10,640
compromises an account,
potentially by a bad operator,

467
00:22:10,640 –> 00:22:15,080
potentially by an inadvertent
activity or something out of band.

468
00:22:15,280 –> 00:22:18,800
Historically, admins go in and make
changes because that’s the way they

469
00:22:18,800 –> 00:22:20,640
operated for twenty years.

470
00:22:20,640 –> 00:22:22,960
But if you’ve got a
controlled lockdown system,

471
00:22:22,960 –> 00:22:26,385
changes should only come from
authorized sources of truth,

472
00:22:26,385 –> 00:22:28,785
not just from anyone
arbitrarily making changes.

473
00:22:28,785 –> 00:22:32,145
So you have to recognize those
and back them out in real time.
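Wade's point about authorized sources of truth can be made concrete with a short sketch. The source names and change format here are invented for illustration: changes from unauthorized origins are recorded and backed out rather than applied.

```python
# Minimal sketch: only changes originating from authorized sources of truth
# are kept; anything written directly out of band is backed out.
# The source names and the change format are assumptions of this sketch.

AUTHORIZED_SOURCES = {"hr_feed", "joiner_mover_leaver_workflow"}

directory = {"jdoe": {"title": "Analyst"}}
reverted = []

def apply_change(user, field, value, source):
    if source not in AUTHORIZED_SOURCES:
        # Out-of-band edit: log it for review and keep the current value.
        reverted.append((user, field, value, source))
        return
    directory.setdefault(user, {})[field] = value

apply_change("jdoe", "title", "Senior Analyst", "hr_feed")       # accepted
apply_change("jdoe", "title", "Domain Admin", "manual_console")  # backed out
```

In practice the "back out" would also write the authoritative value back to the affected system, per the write-back point made later in the discussion.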

474
00:22:32,145 –> 00:22:35,825
And then the use clean piece
is the capability of delivering

475
00:22:35,825 –> 00:22:37,905
this data to other applications.

476
00:22:37,905 –> 00:22:41,865
A lot of our components
in the Identity Data Management stack

477
00:22:41,870 –> 00:22:43,550
function solely for themselves.

478
00:22:43,550 –> 00:22:45,150
They gather identity together,

479
00:22:45,150 –> 00:22:47,470
but they do it just to do
provisioning activities.

480
00:22:47,470 –> 00:22:50,270
They don’t make that data
available for authorization.

481
00:22:50,270 –> 00:22:52,510
Or they do it just
for PAM operations,

482
00:22:52,510 –> 00:22:55,150
but they don’t make it
available for Zero Trust.

483
00:22:55,150 –> 00:22:58,305
The ability to make this
information that’s now pristine

484
00:22:58,305 –> 00:23:02,465
and ideal and a source of
truth available for all the different

485
00:23:02,465 –> 00:23:05,745
applications that need to
consume identity data equally

486
00:23:05,745 –> 00:23:08,705
is critical so that you’re
working off the same set of

487
00:23:08,705 –> 00:23:11,490
information when you’re
authorizing access with your

488
00:23:11,490 –> 00:23:15,370
SAML token, when you’re
authorizing permissions

489
00:23:15,410 –> 00:23:18,290
to onboard a user with certain
access rights, when you're

490
00:23:18,290 –> 00:23:22,130
giving someone the zero trust
permission to access a resource.

491
00:23:22,130 –> 00:23:24,850
You want to do this off the
same clean data you have.

492
00:23:24,850 –> 00:23:27,145
So you have one place to
audit, one place to manage,

493
00:23:27,145 –> 00:23:28,985
and one place to trust.

494
00:23:28,985 –> 00:23:30,465
Absolutely.

495
00:23:30,905 –> 00:23:34,905
And so the next thing we can
look at are a few ways in which

496
00:23:34,905 –> 00:23:37,225
we can approach this challenge.

497
00:23:37,225 –> 00:23:39,400
How do we actually get clean,

498
00:23:39,400 –> 00:23:43,200
stay clean and how are we
actually able to use that data?

499
00:23:43,640 –> 00:23:47,000
So we have three main things that
we wanted to look at for this.

500
00:23:47,000 –> 00:23:50,880
The first one is around
creating a single pane of glass.

501
00:23:51,425 –> 00:23:54,465
The idea here is that if
you don’t have a single place to

502
00:23:54,465 –> 00:23:57,425
see all these accesses,
see all these accounts,

503
00:23:57,425 –> 00:24:01,585
how can you go about
the task of actually

504
00:24:01,585 –> 00:24:05,785
being able to clean things up
and reduce accesses, right?

505
00:24:05,810 –> 00:24:08,930
And so this is kind of the
core piece here that’s the

506
00:24:08,930 –> 00:24:12,530
fundamental most important
upfront part to do is creating

507
00:24:12,530 –> 00:24:13,650
the single pane of glass.

508
00:24:13,650 –> 00:24:15,890
Do you want to comment
on this one, Wade?

509
00:24:15,890 –> 00:24:18,690
Yeah, just along the lines
of what we said a moment ago,

510
00:24:18,690 –> 00:24:20,695
that it’s critical that
everyone’s working from the

511
00:24:20,695 –> 00:24:21,975
same piece of sheet music.

512
00:24:21,975 –> 00:24:24,055
If you’re trying to have
a symphony that’s playing

513
00:24:24,055 –> 00:24:26,535
together and making
harmonious sounds,

514
00:24:26,535 –> 00:24:28,615
they’ve all got to be
using the same information.

515
00:24:28,615 –> 00:24:31,015
They all have a little bit
different way that the notes

516
00:24:31,015 –> 00:24:32,615
are constructed for them,

517
00:24:32,615 –> 00:24:34,735
but they’re playing
the same symphony.

518
00:24:35,310 –> 00:24:38,990
And that’s critical because if
you let each section in your

519
00:24:38,990 –> 00:24:43,230
orchestra write their
own interpretation of

520
00:24:43,230 –> 00:24:46,430
a symphony, you’re going
to get a bunch of crazy,

521
00:24:46,430 –> 00:24:50,295
very uncomfortable
noise and not music.

522
00:24:50,295 –> 00:24:53,015
And this is the critical nature
of identity management: we

523
00:24:53,015 –> 00:24:56,455
need to really start to understand
that this is a holistic model.

524
00:24:56,455 –> 00:25:00,775
This is a complete
posture across the full

525
00:25:00,775 –> 00:25:03,690
span of my identity environment
that we’re trying to manage.

526
00:25:03,690 –> 00:25:07,130
And that starts with everyone
using the same information to

527
00:25:07,130 –> 00:25:09,890
make the decisions that
they make at their level.

528
00:25:10,970 –> 00:25:12,890
Absolutely. Yep.

529
00:25:12,890 –> 00:25:15,210
And part of this, right,

530
00:25:15,210 –> 00:25:19,090
the idea here is to end up with a single
pane of glass again for visibility.

531
00:25:19,125 –> 00:25:21,685
But this also helps
to potentially,

532
00:25:21,685 –> 00:25:24,485
if you have a single source
of truth as a result of this,

533
00:25:24,485 –> 00:25:28,165
reduce the number of credential checks
that have to happen within the org.

534
00:25:28,165 –> 00:25:32,645
So less need for
password duplication and

535
00:25:32,645 –> 00:25:37,400
synchronization, which inherently
reduces your attack surface there.

536
00:25:37,520 –> 00:25:43,080
Also through centralizing
authentication and authorization,

537
00:25:43,120 –> 00:25:47,000
you have again one place to control
those authorizations downstream.

538
00:25:47,245 –> 00:25:50,685
This also helps to simplify
audits and reduce risk on the

539
00:25:50,685 –> 00:25:51,885
compliance side.

540
00:25:51,885 –> 00:25:54,685
If you can very quickly
understand what’s going on and

541
00:25:54,685 –> 00:25:56,925
be able to audit and track that,

542
00:25:56,925 –> 00:26:00,125
you can very quickly spit out
reports in order to provide to

543
00:26:00,125 –> 00:26:04,140
auditors and provide proof that
you are in a compliant state.

544
00:26:04,140 –> 00:26:07,980
And then finally, again,
if you have trusted data,

545
00:26:07,980 –> 00:26:10,620
if you know that it’s in
a single place and clean,

546
00:26:10,620 –> 00:26:13,900
from there you can easily
automate a lot of tasks with

547
00:26:13,900 –> 00:26:17,915
the confidence that those
tasks will be executed

548
00:26:17,915 –> 00:26:19,755
on that clean data.

549
00:26:19,755 –> 00:26:24,995
And so the result of what they
produce will be clean as well.

550
00:26:25,195 –> 00:26:28,635
And I think anyone that’s deployed
an identity management platform,

551
00:26:28,635 –> 00:26:30,930
whether it’s a governance
platform or provisioning

552
00:26:30,930 –> 00:26:32,930
platform, single sign
on, a PAM platform,

553
00:26:32,930 –> 00:26:36,930
understands what a heavy lift it is
to get all the data into the system.

554
00:26:36,930 –> 00:26:39,330
It's the first big hurdle you
have to get over when you're

555
00:26:39,330 –> 00:26:40,850
deploying a platform.

556
00:26:40,850 –> 00:26:43,970
If you can do this once and do
it correctly and then make that

557
00:26:43,970 –> 00:26:46,545
data easily available to
every system that needs it,

558
00:26:46,545 –> 00:26:49,905
especially as you start to
roll out more and more policy

559
00:26:49,905 –> 00:26:52,945
decision points closer and
closer to the resources that

560
00:26:52,945 –> 00:26:54,385
you’re protecting,

561
00:26:54,385 –> 00:26:56,865
you don’t want to go
through that process of reconnecting

562
00:26:56,865 –> 00:27:00,065
and aggregating and normalizing
and try to clean the data

563
00:27:00,065 –> 00:27:02,690
seven, eight, nine times
across your organization.

564
00:27:02,690 –> 00:27:06,050
This is where you should focus
your effort once and then reuse

565
00:27:06,050 –> 00:27:07,250
this as much as possible.

566
00:27:07,250 –> 00:27:09,330
That’s a tremendous
boost in efficiency.

567
00:27:09,330 –> 00:27:12,850
In fact, Gartner indicated
you’d double the ROI on your

568
00:27:12,850 –> 00:27:16,875
IGA deployment if you did the
data hygiene, data cleanup,

569
00:27:16,875 –> 00:27:20,315
single source of identity data
upfront before you started to

570
00:27:20,315 –> 00:27:21,515
embark on that project.

571
00:27:21,515 –> 00:27:25,795
So it’s definitely something to invest
in because the dividends are tremendous.

572
00:27:25,915 –> 00:27:26,715
Absolutely.

573
00:27:26,715 –> 00:27:30,755
And closely related to this
is kind of a second task.

574
00:27:31,100 –> 00:27:34,540
As you’re going through the
task of creating this single

575
00:27:34,540 –> 00:27:36,620
source of truth, a
single pane of glass,

576
00:27:36,620 –> 00:27:40,140
what becomes apparent very
quickly is that there are most

577
00:27:40,140 –> 00:27:43,060
likely a large number of
sources within an organization,

578
00:27:43,175 –> 00:27:48,095
a large number of systems which
contain identity related information.

579
00:27:48,295 –> 00:27:51,015
And traditionally these
would be directories,

580
00:27:51,015 –> 00:27:53,175
these could also be
cloud repositories,

581
00:27:53,175 –> 00:27:56,770
but creating this single pane
of glass highlights a need

582
00:27:56,770 –> 00:27:59,810
around potentially
consolidating and modernizing a

583
00:27:59,810 –> 00:28:01,530
lot of this infrastructure.

584
00:28:01,570 –> 00:28:05,250
There’s two main
challenges to this task.

585
00:28:05,250 –> 00:28:06,930
One is technical in nature,

586
00:28:06,930 –> 00:28:09,690
the other one is more
qualitative in nature.

587
00:28:09,895 –> 00:28:11,495
On the technical side,

588
00:28:11,495 –> 00:28:14,135
when we talk about
infrastructure consolidation,

589
00:28:14,135 –> 00:28:17,815
what we’re talking about really is
cleaning up both on prem and

590
00:28:17,815 –> 00:28:21,095
cloud resources to reduce
the need to sync fragments of

591
00:28:21,095 –> 00:28:23,855
identity across these
different places.

592
00:28:23,860 –> 00:28:27,740
It also has to do
with decommissioning

593
00:28:27,940 –> 00:28:31,940
and performing modernization that’s
a lot of the time long overdue.

594
00:28:31,940 –> 00:28:33,060
So for example,

595
00:28:33,060 –> 00:28:37,260
any out of support legacy directories
need to be replaced anyway.

596
00:28:37,395 –> 00:28:40,595
Other types of identity
access management solutions,

597
00:28:40,595 –> 00:28:41,955
MIM is a great example, right?

598
00:28:41,955 –> 00:28:44,515
It’s been in maintenance
mode for years now.

599
00:28:44,515 –> 00:28:46,995
It’s end of life in
twenty twenty nine.

600
00:28:46,995 –> 00:28:48,115
So they’re getting there.

601
00:28:48,115 –> 00:28:49,395
But again,

602
00:28:49,395 –> 00:28:53,195
there’s a need to replace a lot
of these tools that are in place.

603
00:28:53,710 –> 00:28:57,950
And part of a zero
trust modernization

604
00:28:57,950 –> 00:29:01,550
project can be to actually clean
up a lot of this tech debt.

605
00:29:01,550 –> 00:29:04,110
The other one is
qualitative in nature.

606
00:29:04,110 –> 00:29:06,505
What I mean by that
is it’s kind of

607
00:29:06,505 –> 00:29:09,865
the deeper understanding
beyond just the data itself,

608
00:29:09,865 –> 00:29:13,065
it’s understanding what the
data means and the accesses

609
00:29:13,065 –> 00:29:14,745
that it provides.

610
00:29:14,745 –> 00:29:17,705
So this is everything around
things like orphan account

611
00:29:17,705 –> 00:29:20,320
removal, streamlining
or realigning

612
00:29:20,320 –> 00:29:23,040
access, potentially changing
your access model, right?

613
00:29:23,040 –> 00:29:26,280
As you shift towards a
policy based access model,

614
00:29:26,480 –> 00:29:30,655
you still have roles in place that
define kind of a base level access,

615
00:29:30,655 –> 00:29:34,335
additional policies based off
of attributes that look into

616
00:29:34,335 –> 00:29:36,135
additional conditions.

617
00:29:36,175 –> 00:29:38,175
But in order to manage
this more effectively,

618
00:29:38,175 –> 00:29:40,495
you might look at also
potentially changing your role

619
00:29:40,495 –> 00:29:42,175
model behind the scenes, right?

620
00:29:42,175 –> 00:29:45,750
So there’s a lot of
cleanup that can be

621
00:29:45,750 –> 00:29:47,030
done here, that can go on,

622
00:29:47,030 –> 00:29:50,230
and a lot of streamlining as
part of this modernization and

623
00:29:50,230 –> 00:29:52,030
Zero Trust deployment.

624
00:29:52,550 –> 00:29:55,750
And I think what it’s
critical to recognize here is

625
00:29:55,750 –> 00:29:58,935
this is something that, as an
industry, we finally started

626
00:29:58,935 –> 00:30:02,775
admitting to our customers a
few years ago with Zero Trust,

627
00:30:02,775 –> 00:30:04,135
that this is a journey.

628
00:30:04,135 –> 00:30:06,615
This is not one product.
This is not one project.

629
00:30:06,615 –> 00:30:10,615
This is an effort you
undertake to continuously chip

630
00:30:10,615 –> 00:30:13,240
away at this iceberg.

631
00:30:13,240 –> 00:30:15,960
There’s a lot of tech debt
in most organizations.

632
00:30:15,960 –> 00:30:18,200
You want to focus on the
low hanging fruit early.

633
00:30:18,200 –> 00:30:20,440
You want to focus on
the high risk early,

634
00:30:20,440 –> 00:30:23,555
but you want to put in
processes that help you sort of

635
00:30:23,555 –> 00:30:27,075
move down the line and eat that
elephant one bite at a time,

636
00:30:27,075 –> 00:30:29,075
but get it all consumed.

637
00:30:29,075 –> 00:30:32,115
Equally critical is that you
have to put in, as JR has said,

638
00:30:32,115 –> 00:30:34,275
you have to put in measures
to maintain that data,

639
00:30:34,275 –> 00:30:35,075
to keep it clean,

640
00:30:35,075 –> 00:30:38,040
because otherwise you’re going
to be in a continuous loop of

641
00:30:38,040 –> 00:30:41,400
cleaning information that gets
dirty as soon as you let go of it.

642
00:30:41,400 –> 00:30:44,280
And what’s critical here is to
recognize also that we’re not

643
00:30:44,280 –> 00:30:46,920
talking about just cleaning
up the data in the master user

644
00:30:46,920 –> 00:30:48,840
record in the unified data.

645
00:30:48,840 –> 00:30:51,080
We’re talking about writing
back to the sources and

646
00:30:51,080 –> 00:30:54,625
remediating those errors in
the original sources of truth.

647
00:30:54,625 –> 00:30:56,625
Because there’s still going
to be applications in your

648
00:30:56,625 –> 00:30:59,585
environment no matter how hard
you work that are still talking

649
00:30:59,585 –> 00:31:02,705
to original data sources
that aren’t able to redirect

650
00:31:02,705 –> 00:31:05,985
themselves to a federated
access or to a Zero Trust model.

651
00:31:05,985 –> 00:31:08,590
So you need to make sure that
the data, everywhere it exists,

652
00:31:08,590 –> 00:31:09,870
wherever it’s been synchronized,

653
00:31:09,870 –> 00:31:11,710
wherever it’s been distributed,

654
00:31:11,710 –> 00:31:13,470
as much as it still needs to be,

655
00:31:13,470 –> 00:31:16,430
that data picks up
the cleanup changes.

656
00:31:16,430 –> 00:31:19,990
That data reflects the quality of
the data in your master record.

657
00:31:20,535 –> 00:31:21,255
Absolutely.

658
00:31:21,255 –> 00:31:23,255
And there’s something that you
alluded to in there that leads

659
00:31:23,255 –> 00:31:25,495
to our last point here, Wade.

660
00:31:25,495 –> 00:31:29,215
And then this is our second
to last slide for everyone.

661
00:31:29,335 –> 00:31:32,375
It’s the idea of taking
reviews that are performed for

662
00:31:32,375 –> 00:31:35,450
compliance purposes and then
turning them into living controls.

663
00:31:35,450 –> 00:31:36,410
As you said Wade,

664
00:31:36,410 –> 00:31:38,410
a lot of the time people
perform a cleanup but

665
00:31:38,410 –> 00:31:40,650
immediately once
that cleanup is done,

666
00:31:40,650 –> 00:31:44,010
it’s no longer effective
because the second after you’re

667
00:31:44,010 –> 00:31:46,730
done saying that
something is valid,

668
00:31:46,730 –> 00:31:49,545
a condition changes and
suddenly it’s no longer valid.

669
00:31:49,545 –> 00:31:51,865
And this is especially the
case when you’re talking about

670
00:31:51,865 –> 00:31:53,865
compliance reviews or audits.

671
00:31:53,865 –> 00:31:57,865
A lot of the time this is a
once or twice a year task where

672
00:31:57,865 –> 00:31:59,785
the audit team goes through,

673
00:31:59,785 –> 00:32:02,185
creates a list of suggestions
and improvements that need to

674
00:32:02,185 –> 00:32:05,140
happen in order to make
sure that the organization

675
00:32:05,140 –> 00:32:06,660
is in a compliant state.

676
00:32:06,660 –> 00:32:10,100
Those changes are applied but
immediately afterwards, right?

677
00:32:10,100 –> 00:32:13,060
Things have changed even since
the audit was finished and

678
00:32:13,060 –> 00:32:16,420
those changes were applied
and it’s no longer valid.

679
00:32:16,420 –> 00:32:19,345
And now you have six months
potentially or a year until the

680
00:32:19,345 –> 00:32:21,705
next time you
perform this kind of

681
00:32:21,985 –> 00:32:25,545
recertification
almost on the org.

682
00:32:26,865 –> 00:32:28,945
Hey, go ahead Wade. Do you
have comments on this one?

683
00:32:28,945 –> 00:32:30,145
Otherwise I can keep going.

684
00:32:30,145 –> 00:32:32,120
Yeah, just again to reiterate,

685
00:32:32,120 –> 00:32:35,080
I think we started at the very
beginning explaining how fast a

686
00:32:35,080 –> 00:32:37,560
bad operator can move
in your environment.

687
00:32:37,560 –> 00:32:41,720
So once something is being
altered on a real-time basis,

688
00:32:41,720 –> 00:32:44,280
you need to be able to
have policies in place that will

689
00:32:44,280 –> 00:32:47,755
recognize that in real time
that can either block it or

690
00:32:47,755 –> 00:32:49,595
alert on it or remediate it.

691
00:32:49,595 –> 00:32:50,475
Because a lot of time,

692
00:32:50,475 –> 00:32:53,195
ITDR systems are looking for
behavioral patterns or looking

693
00:32:53,195 –> 00:32:54,075
for network traffic.

694
00:32:54,075 –> 00:32:57,195
They’re not looking at the
identity data itself that a bad

695
00:32:57,195 –> 00:33:00,320
operator may be manipulating
to escalate his own privileges

696
00:33:00,320 –> 00:33:02,240
to move within the organization.

697
00:33:02,240 –> 00:33:04,880
So you need to be watching the front
door and the back door at the

698
00:33:04,880 –> 00:33:07,680
same time and the closer
you can get to real time,

699
00:33:07,680 –> 00:33:10,080
the closer you can
get to preventing something from

700
00:33:10,080 –> 00:33:13,320
going bad, or getting worse
if it is already starting to.

701
00:33:13,455 –> 00:33:16,015
And that, again, becomes
critical, as you said,

702
00:33:16,015 –> 00:33:20,415
that the ability to
maintain that data is the

703
00:33:20,415 –> 00:33:23,935
insurance you’re buying on top
of all the work you’re doing.

704
00:33:23,935 –> 00:33:26,415
It’s like you go to an
appliance store and you buy a

705
00:33:26,415 –> 00:33:29,620
brand new oven, and they offer
you a three year warranty.

706
00:33:29,620 –> 00:33:32,260
This stay clean methodology
is your warranty.

707
00:33:32,260 –> 00:33:35,060
Is all the work I put in to
clean up going to be valuable?

708
00:33:35,060 –> 00:33:38,660
Yes, because I’m maintaining
that and it’s a lifestyle.

709
00:33:38,660 –> 00:33:40,740
Once you’ve implemented
these processes,

710
00:33:40,740 –> 00:33:42,655
once you have these
systems in place,

711
00:33:42,655 –> 00:33:46,495
they run automatically
and help you stay clean.

712
00:33:46,495 –> 00:33:49,215
But you have to take the extra
step of going all the way out

713
00:33:49,215 –> 00:33:50,895
to doing that, too.

714
00:33:50,895 –> 00:33:51,855
Absolutely.

715
00:33:51,855 –> 00:33:54,910
And the traditional approach, one
way of keeping

716
00:33:54,910 –> 00:33:57,950
things clean,
especially around reviews,

717
00:33:57,950 –> 00:34:00,830
is the concept of micro
recertification campaigns.

718
00:34:00,830 –> 00:34:04,750
The idea there is using
a piece of software that

719
00:34:04,750 –> 00:34:08,885
will look at the changes that have
occurred between review periods.

720
00:34:08,885 –> 00:34:11,925
And this could be
daily snapshots, or even real

721
00:34:11,925 –> 00:34:14,805
time, increasingly, when we're
talking about identity security

722
00:34:14,805 –> 00:34:16,085
posture management.

723
00:34:16,085 –> 00:34:19,365
The idea is to have a tool that
will ingest any change that

724
00:34:19,365 –> 00:34:23,125
occurs and immediately evaluate
that against the previously

725
00:34:23,125 –> 00:34:26,690
known state, previously
known clean state.

726
00:34:26,690 –> 00:34:29,890
If there’s a new access
that’s anomalous in some way,

727
00:34:29,890 –> 00:34:33,490
it either alerts users so that
they can come verify and check

728
00:34:33,490 –> 00:34:35,970
off and say that yes,
this is indeed expected.

729
00:34:35,970 –> 00:34:37,010
In other scenarios,

730
00:34:37,010 –> 00:34:40,545
it might prompt a larger access
review where you go through a

731
00:34:40,545 –> 00:34:44,545
batch of changes once a week,
potentially reviewing the few that

732
00:34:44,545 –> 00:34:47,505
occurred, maybe the ten
riskiest ones.

733
00:34:47,505 –> 00:34:50,945
But again, the idea is to have
some software in place that allows

734
00:34:50,945 –> 00:34:54,070
you to very quickly pick up
on those changes and then turn

735
00:34:54,070 –> 00:34:56,470
these reviews into more
frequent micro recertification

736
00:34:56,470 –> 00:35:00,150
reviews, to be able
to guarantee, in a much less

737
00:35:00,150 –> 00:35:04,390
painful way for users and
administrators, that

738
00:35:04,390 –> 00:35:07,150
you’ll stay in a good
clean state over time.
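The micro-recertification pass described here reduces to a diff plus a risk ranking. A minimal sketch, with invented data and risk scores, might look like this: compare current accesses against the previous clean snapshot and surface only the riskiest new grants for this cycle's review.

```python
# Minimal micro-recertification sketch (hypothetical data and risk scores):
# diff current accesses against the last clean snapshot and keep only the
# top-N riskiest new grants for human review.

previous = {"jdoe": {"vpn"}, "asmith": {"crm-read"}}
current = {"jdoe": {"vpn", "prod-db-write"},
           "asmith": {"crm-read", "wiki-edit", "payroll-approve"}}
risk = {"prod-db-write": 9, "payroll-approve": 8, "wiki-edit": 2}

# Everything granted since the last clean snapshot.
new_grants = [
    (user, ent)
    for user, ents in current.items()
    for ent in ents - previous.get(user, set())
]

# Review only the top N riskiest changes this cycle.
TOP_N = 2
campaign = sorted(new_grants, key=lambda g: risk.get(g[1], 0),
                  reverse=True)[:TOP_N]
```

Keeping each campaign small and risk-ranked is what makes the review frequent without turning it into a rubber-stamping exercise.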

739
00:35:07,695 –> 00:35:10,575
And so this leads us to the
last slide here which is

740
00:35:10,575 –> 00:35:13,535
what kind of timeline are
you looking at when you’re

741
00:35:13,535 –> 00:35:16,935
performing this kind of
get clean, stay clean approach?

742
00:35:17,695 –> 00:35:21,055
So the idea here is to look
at this in terms of short,

743
00:35:21,055 –> 00:35:23,095
medium and long term.

744
00:35:23,440 –> 00:35:25,760
In terms of the
short term, right,

745
00:35:25,760 –> 00:35:29,840
within the first month of this
kind of modernization project

746
00:35:29,840 –> 00:35:31,520
moving towards Zero Trust,

747
00:35:31,520 –> 00:35:33,680
what you want to do is first
map out all the different

748
00:35:33,680 –> 00:35:36,240
systems within your enterprise
that manage identities.

749
00:35:36,240 –> 00:35:38,415
So this is identifying
the different silos,

750
00:35:38,415 –> 00:35:43,135
repositories where identities
reside, performing some initial

751
00:35:43,135 –> 00:35:46,655
analysis to assess the
quality of that data.

752
00:35:46,655 –> 00:35:49,535
So this has to do with
identifying some of those

753
00:35:49,535 –> 00:35:50,735
orphan accounts,

754
00:35:50,735 –> 00:35:52,575
making sure that you can
attach those to people,

755
00:35:52,575 –> 00:35:53,400
things like that.

756
00:35:53,400 –> 00:35:55,880
And then starting to define
your master user record,

757
00:35:55,880 –> 00:35:57,720
which will be the one
place that you go,

758
00:35:57,720 –> 00:36:01,480
where all the applications go
to gain access to data that

759
00:36:01,480 –> 00:36:03,080
they need for authorization,

760
00:36:03,080 –> 00:36:05,400
as well as the single place
that you’ll be able to perform

761
00:36:05,400 –> 00:36:06,520
analytics on, right?

762
00:36:06,520 –> 00:36:09,375
As changes come in from
the backend sources, again,

763
00:36:09,375 –> 00:36:11,935
new accounts created,
deleted, etcetera.

764
00:36:11,935 –> 00:36:15,455
Having this master user record
will be the foundation for the

765
00:36:15,455 –> 00:36:17,575
analysis that you perform there.

766
00:36:17,695 –> 00:36:21,615
Medium term, kind of the
three month idea there is to

767
00:36:21,615 –> 00:36:24,660
start cleaning and
consolidating the data itself.

768
00:36:24,660 –> 00:36:28,500
So bringing it into
one directory or fewer

769
00:36:28,500 –> 00:36:30,700
directories than exist today.

770
00:36:30,740 –> 00:36:33,300
Part of this is if you’re
looking at moving some of the

771
00:36:33,300 –> 00:36:34,660
data to the cloud as well,

772
00:36:34,660 –> 00:36:36,620
consolidating down
to one tenant.

773
00:36:37,455 –> 00:36:41,295
Also, going through the process
to start triggering access reviews

774
00:36:41,295 –> 00:36:44,255
on key events on a
more regular basis.

775
00:36:44,255 –> 00:36:48,335
So again, risky accesses that are
granted to users, new privileges,

776
00:36:48,335 –> 00:36:51,690
things like that should trigger
a micro recertification or a

777
00:36:51,690 –> 00:36:54,490
micro review so that somebody
signs off on a change that

778
00:36:54,490 –> 00:36:57,530
happens to make sure
that it’s good to go.

779
00:36:57,530 –> 00:37:00,090
And then from there designing
and implementing some of the

780
00:37:00,090 –> 00:37:03,530
governance policies and some of
the policy based access control

781
00:37:03,530 –> 00:37:05,745
that will be in
the final product.

782
00:37:05,745 –> 00:37:07,825
And then finally, within
the twelve month span,

783
00:37:07,825 –> 00:37:10,705
it’s working on the maturity of
the model and the deployment.

784
00:37:10,705 –> 00:37:14,225
The idea here being to
automate reviews and detect

785
00:37:14,225 –> 00:37:16,545
anomalies in real
time as they occur,

786
00:37:16,545 –> 00:37:19,020
but also to integrate
the results.

787
00:37:19,020 –> 00:37:21,820
So this master user record that
we started defining within the

788
00:37:21,820 –> 00:37:25,580
first month with other applications
within the organization.

789
00:37:25,580 –> 00:37:29,340
So enterprise risk management
applications, things like that.

790
00:37:29,340 –> 00:37:32,555
It’s basically taking all
this work that’s done to unify

791
00:37:32,555 –> 00:37:36,075
data, associate risk
scores with it, etcetera,

792
00:37:36,075 –> 00:37:39,755
etcetera, and allowing other
security focused applications

793
00:37:39,755 –> 00:37:43,515
either within the identity
team or the SOC team or the

794
00:37:43,515 –> 00:37:46,795
security team to take that data
and then make better decisions

795
00:37:46,795 –> 00:37:50,650
and flag anomalous behavior,
things like that very quickly.
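
[Editor's note: a minimal sketch of the "master user record with a risk score" idea described above, so downstream SOC or risk tools can consume one unified view. The merge logic and the scoring rule here are toy assumptions for illustration only.]

```python
# Hypothetical sketch: fold several per-system accounts into one
# master user record and attach a simple risk score to it.
def merge_accounts(accounts: list[dict]) -> dict:
    """Combine per-system accounts into one unified record."""
    record = {"ids": [], "entitlements": set()}
    for acct in accounts:
        record["ids"].append(acct["system"] + ":" + acct["id"])
        record["entitlements"] |= set(acct.get("entitlements", []))
    # Toy scoring assumption: each entitlement adds risk,
    # admin-level entitlements add more.
    record["risk"] = sum(5 if "admin" in e.lower() else 1
                         for e in record["entitlements"])
    return record

def flag_anomalous(record: dict, threshold: int = 10) -> bool:
    """Downstream consumers (SOC, ERM tools) can flag on the score."""
    return record["risk"] >= threshold

master = merge_accounts([
    {"system": "AD", "id": "jdoe", "entitlements": ["Domain Admins"]},
    {"system": "SaaS", "id": "j.doe", "entitlements": ["billing", "reports"]},
])
```

The design point is that the unification work is done once, and every security-focused application then reads the same record instead of re-correlating raw accounts.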

796
00:37:50,650 –> 00:37:52,290
And then finally,

797
00:37:52,810 –> 00:37:56,970
having a system in place,
so dashboards and analytics,

798
00:37:56,970 –> 00:38:01,010
continuous controls that are
applied in order to observe

799
00:38:01,450 –> 00:38:02,455
the security

800
00:38:02,455 –> 00:38:06,335
posture, the identity
data security posture,

801
00:38:06,535 –> 00:38:10,775
and its evolution over time with
real time changes taken into account.
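
[Editor's note: the dashboards-and-continuous-controls step above amounts to snapshotting a posture metric over time. The metric below, counting orphaned and overprivileged accounts, is a hypothetical stand-in for whatever posture score an organization actually tracks.]

```python
# Hypothetical sketch: record periodic snapshots of an identity-security
# posture score so a dashboard can show its evolution over time.
from datetime import date

history: list[tuple[date, float]] = []

def record_snapshot(day: date, orphaned: int,
                    overprivileged: int, total: int) -> float:
    """Lower is better: fraction of accounts with a known issue."""
    score = (orphaned + overprivileged) / max(total, 1)
    history.append((day, score))
    return score

def trend() -> float:
    """Negative means posture is improving between first and last snapshot."""
    return history[-1][1] - history[0][1]

record_snapshot(date(2024, 1, 1), orphaned=40, overprivileged=60, total=1000)
record_snapshot(date(2024, 4, 1), orphaned=10, overprivileged=30, total=1000)
```

In practice the snapshots would be fed by real-time change events rather than manual calls, which is what lets the dashboard reflect the current state instead of a quarterly report.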

802
00:38:10,775 –> 00:38:14,095
Wade, anything to add
to this? Any comments?

803
00:38:14,210 –> 00:38:16,290
Yeah, I just want to highlight
a couple of things here.

804
00:38:16,290 –> 00:38:18,770
One, again, restating
that this is a journey.

805
00:38:18,770 –> 00:38:21,970
As you can see, it’s kind of
illustrated here in the curve

806
00:38:21,970 –> 00:38:24,930
and the dots and the timeline.

807
00:38:24,930 –> 00:38:26,235
So this is a process.

808
00:38:26,235 –> 00:38:28,875
And you want to make sure you do
the process in the right order.

809
00:38:28,875 –> 00:38:30,555
If you’re going on a trip,

810
00:38:30,555 –> 00:38:32,635
if you’re going from
Los Angeles to New York,

811
00:38:32,635 –> 00:38:34,475
or you’re going from
London to Paris,

812
00:38:34,475 –> 00:38:37,355
there are things you do first
and there’s things you do later.

813
00:38:37,355 –> 00:38:41,810
You don’t pick up your
bags at baggage claim

814
00:38:41,810 –> 00:38:45,170
in Paris before you
take off in London.

815
00:38:45,170 –> 00:38:47,810
So you need to make sure you’re
doing things in the right order here.

816
00:38:47,810 –> 00:38:50,130
It’s almost analogous to
building a house.

817
00:38:50,130 –> 00:38:53,975
There are a lot of things to
consider when building a house.

818
00:38:53,975 –> 00:38:56,535
I have a lot of different
components in my identity stack

819
00:38:56,535 –> 00:38:58,695
that I need to pay attention to:

820
00:38:58,695 –> 00:39:02,055
heating and air conditioning,
my windows, my kitchen,

821
00:39:02,055 –> 00:39:03,975
the number of bathrooms I have.

822
00:39:03,975 –> 00:39:06,100
But all this is built
on a foundation.

823
00:39:06,100 –> 00:39:08,020
That foundation
is identity data.

824
00:39:08,020 –> 00:39:10,500
And if you don’t have
that foundation correct,

825
00:39:10,500 –> 00:39:13,060
you don’t have it sized properly
for the house you’re building,

826
00:39:13,060 –> 00:39:15,860
you don’t have it strong enough
to hold the second floor,

827
00:39:15,860 –> 00:39:20,055
then everything else you do
later is not going to stand up

828
00:39:20,055 –> 00:39:23,495
to the test of time or the
test of external attackers on

829
00:39:23,495 –> 00:39:24,455
your environment.

830
00:39:24,455 –> 00:39:26,775
So that foundation is critical.

831
00:39:26,775 –> 00:39:28,775
It is the part
that, you know,

832
00:39:28,775 –> 00:39:31,175
it’s like the vegetables
or cleaning the garage.

833
00:39:31,175 –> 00:39:34,175
No one likes to deal with
cleaning up identity data.

834
00:39:34,310 –> 00:39:35,910
It’s not the most popular part.

835
00:39:35,910 –> 00:39:38,230
We want to get to the
dessert and see, you know,

836
00:39:38,230 –> 00:39:41,910
dashboards and access
requests granted.

837
00:39:41,910 –> 00:39:44,070
But you need to lay that
foundation first to make sure

838
00:39:44,070 –> 00:39:45,670
that piece is put into play.

839
00:39:45,670 –> 00:39:50,125
And then everything else you do as
you move along will fall into place.

840
00:39:50,125 –> 00:39:52,205
Even things like a
merger and acquisition,

841
00:39:52,205 –> 00:39:55,005
if you think of that as absorbing
a whole other organization

842
00:39:55,005 –> 00:39:59,245
into yours, starts with
laying that foundation at the

843
00:39:59,245 –> 00:40:01,565
new organization that
you’re acquiring.

844
00:40:01,565 –> 00:40:03,005
What kind of data do they have?

845
00:40:03,005 –> 00:40:04,845
What’s the condition of
the information there?

846
00:40:04,845 –> 00:40:07,820
What kind of open
doors am I inviting into

847
00:40:07,820 –> 00:40:11,820
my environment if I simply just
connect our two organizations together?

848
00:40:11,820 –> 00:40:14,620
I need to take
that acquired

849
00:40:14,620 –> 00:40:18,220
organization through the same
process here to get it to a

850
00:40:18,220 –> 00:40:22,255
level of maturity where I am then
comfortable joining to my environment.

851
00:40:22,255 –> 00:40:25,455
Now, when you’ve built all these
processes internally and you’re

852
00:40:25,455 –> 00:40:29,375
using them religiously, it’s
much easier to then add on

853
00:40:29,375 –> 00:40:32,655
another organization and get
the business value out of a

854
00:40:32,655 –> 00:40:35,520
merger much more quickly
without introducing a whole

855
00:40:35,520 –> 00:40:38,520
other layer of
security and risk.

856
00:40:41,120 –> 00:40:43,360
Absolutely. Thank you, Wade.

857
00:40:43,360 –> 00:40:45,440
That’s all we have for today.

858
00:40:45,440 –> 00:40:47,240
Thank you, everyone,
for attending.

859
00:40:48,105 –> 00:40:49,465
Again, if you have
any questions,

860
00:40:49,465 –> 00:40:51,425
please feel free to reach out.

861
00:40:51,865 –> 00:40:55,225
I think we’re running a
bit over time, so apologies,

862
00:40:55,225 –> 00:40:58,305
we probably can’t
do a Q and A today.

863
00:40:58,425 –> 00:41:01,225
That said, we look forward
to hearing from you.

864
00:41:01,225 –> 00:41:03,905
We will have some other
sessions coming up

865
00:41:04,090 –> 00:41:07,210
as follow ons to this that
will kind of focus a bit more

866
00:41:07,210 –> 00:41:11,210
in-depth on some of the

867
00:41:11,210 –> 00:41:14,890
specific actions you can take
in order to move towards Zero

868
00:41:14,890 –> 00:41:18,130
Trust. Wade, do you have any
additional information on that?

869
00:41:18,165 –> 00:41:21,445
Just to let you know that
everyone that attended today

870
00:41:21,445 –> 00:41:24,725
will get a copy of the slides
and a recording of our session

871
00:41:24,725 –> 00:41:26,325
today, everyone
that’s registered.

872
00:41:26,325 –> 00:41:28,805
We will be sending out
additional invitations to the

873
00:41:28,805 –> 00:41:32,750
next set of sessions a little
bit after the break for the

874
00:41:32,750 –> 00:41:33,630
beginning of summer.

875
00:41:33,630 –> 00:41:36,030
So you’ll have, again,
as JR mentioned,

876
00:41:36,030 –> 00:41:39,710
more in-depth analysis
and recommendations around

877
00:41:39,710 –> 00:41:42,670
taking the steps forward
to making this work.

878
00:41:42,670 –> 00:41:45,470
But lastly, I want
to say, JR,

879
00:41:45,470 –> 00:41:46,190
thank you very much.

880
00:41:46,190 –> 00:41:48,155
It’s always a pleasure
working with you.

881
00:41:48,155 –> 00:41:51,435
I appreciate the depth of your
insights and your knowledge,

882
00:41:51,435 –> 00:41:53,995
and I look forward to
continuing this series with you.

883
00:41:53,995 –> 00:41:56,075
Thank you, Wade. Same
to you. Take care.

884
00:41:56,075 –> 00:41:57,875
All right. Thank you, everybody.