Fly the Plane is how Dr. Timothy Chester, Vice President of Information Technology, The University of Georgia, characterizes his philosophy and approach to cybersecurity readiness. Dr. Chester spoke at length about a proactive approach to information security management anchored on strategic planning, senior leadership commitment, strong teamwork, sophisticated intelligence monitoring, and robust training and testing practices. His candor and reflection made for a most interesting conversation.
To access and download the entire podcast summary with discussion highlights --
https://www.dchatte.com/episode-11-fly-the-plane-a-cios-approach-to-cybersecurity-readiness/
-------------------------------------------------------------------------------------
Connect with Host Dr. Dave Chatterjee and Subscribe to the Podcast
Please subscribe to the podcast so you don't miss any new episodes! And please leave the show a rating if you like what you hear. New episodes release every two weeks.
Connect with Dr. Chatterjee on these platforms:
LinkedIn: https://www.linkedin.com/in/dchatte/
Website: https://dchatte.com/
Cybersecurity Readiness Book: https://www.amazon.com/Cybersecurity-Readiness-Holistic-High-Performance-Approach/dp/1071837338
Welcome to the Cybersecurity Readiness Podcast
series with Dr. Dave Chatterjee. Dr. Chatterjee is the author of Cybersecurity Readiness: A Holistic and High-Performance Approach, published by Sage. He has been studying cybersecurity
for over a decade, authored and edited scholarly papers,
delivered talks, conducted webinars, consulted with
companies, and served on a cybersecurity SWAT team with
chief information security officers. Dr. Chatterjee is an
Associate Professor of Management Information Systems
at the Terry College of Business, the University of
Georgia and Visiting Professor at Duke University's Pratt
School of Engineering.
Hello, everyone. Welcome to this
episode of the Cybersecurity Readiness Podcast. Today I have
the honor of having Dr. Timothy Chester, Vice President of
Information Technology and Chief Information Officer at the
University of Georgia as our guest. A seasoned C-level executive, Dr. Chester has over two decades of experience in state-supported and private higher education institutions. He has
led large-scale business transformation efforts through
on-time, on-budget ERP implementations, driving increased revenue and improved student outcomes through better use of data and analytics. He's an expert practitioner in
developing improved information security programs for large
geographically distributed enterprises, with 50,000 plus
users, virtually eliminating data disclosures. Tim is also
highly regarded for leading IT turnarounds, increasing IT's
reputation as a trusted and respected partner in the pursuit
of strategic goals. Last but not least, Dr. Chester is a
noted author with over a dozen publications in the field.
Welcome to the podcast, Tim.
Dave, let me say first, I'm just delighted to have the chance to be here with you today. And I've really enjoyed reading through your book. I've not finished it yet, but I think you've done a very masterful job of making a complex subject accessible to a wide audience of business professionals. And you should be commended for that, and I offer you my congratulations, and thank you so much for sharing a copy with me.
Thank you.
You know, I think we stress, well, we use a
phrase in my organization quite a bit; in fact, it's part of our
strategic plan. And that phrase is Fly the Plane. And what this
relates back to is an exercise that I learned a long time ago
when I was a graduate student at Texas A&M University 30 years ago; I was a little bored on the side and had a little cash to spend. And so I worked towards a pilot's license and
did a lot of single engine plane flying over the farmlands in the
plains of Central Texas. And you learn very early in pilot
training to always fly the plane. And what that means is that if you are not constantly anticipating and thinking through what's fixing to happen and what could happen, frankly, the plane will fly you. You'll have a burst of wind that might come from a heat thermal and knocks you off course a little bit, and you'll have to course correct to get back there. And if you're not proactive, anticipating the way things will go, the plane will fly you and you'll react, and what you'll find over time is that you will react in stronger and stronger ways, which creates a negative reaction that again you have to react to, and frankly, that's how disasters happen in flying a plane. So, we have stressed that in our
organization quite a bit, and when we say fly the plane, what we simply mean is that through strong teamwork, strategic planning, and foresight, we try to constantly think through the types of scenarios that we could be facing. And we try to plan for the little factors that probably aren't a high probability of occurring but could be high impact if they do occur. So, if you go to the website for UGA's IT organization, you'll see that flying the plane is a stated part of our strategic plan, whether we're planning for the network performance and load associated with class registration or thinking through the possibilities of a ransomware attack on the University.
That's a very interesting metaphor, I love it, flying the plane; you know, it tells me about the importance of being very prepared, being proactive, knowing or rehearsing how best to deal with different scenarios, so you can't afford to be caught blindsided. And talking about not being able to afford to be caught blindsided, what are some cybersecurity blind spots? And how do you cope with them?
Right. Well, you and I both teach
business process management at the University. It's a strong
set of competencies and skills that I think serve our graduates
really, really well. And part of that is what we call root cause
analysis, right? And the thinking is that surface
explanations and surface understandings tend to not be
comprehensive enough. And we as human beings tend to look for explanations that would suggest that we didn't necessarily have a lot of power to deal with it when something really, really bad happens. Root cause analysis forces you to continue asking, why did this happen, why did that happen, until you get to a level where you have uncovered a set of conditions over which you actually had a deliberate amount of control, where you could have done something about it. But
our human desire to basically live through rote repetition and
structure that's comfortable and unchanging leads us to be
creatures of habit. And again, creatures of habit who are
following the habits and following the rote behaviors
that they always engage in, typically find themselves in circumstances sometimes where again, the plane starts flying them, and the ways in which they react to that plane become wilder and wilder swings that could lead to a disaster. I have only worked in higher education and state government, which is its own vertical industry, but I think it's true across the other verticals, whether we're talking about finance or manufacturing or commerce, that people are good, they care about their
employers, they want to do a good job. But we as humans,
again, are most comfortable when structures tend to be unchanging and there's nothing really unexpected going on, and
we tend to assume the best and think that the worst will never
really happen. And that tends to create the environment where
really bad things can happen. Now the most serious of
information security incidents, or breaches tend to be like
plane crashes, again, if I continue to use the aeronautic metaphor here, in that planes tend to crash not because one thing happened unexpectedly, but because multiple things happened at the same point in time, which create a set of circumstances that allow something low frequency but high impact to occur. So a lot of
time in the information security space, the blind spots happen
just because the IT industry and the IT culture, within business
places a premium on good customer service and sometimes
good customer service and a focus on functionality of our
systems, and what we do comes at the expense of maintainability,
compatibility, and information security. So you know, we have near misses all the time; we have a good team here that's proactive and can catch them. And we had a near miss here recently with some ransomware. And it really was all about a very good employee working in a very good unit, probably doing something that they shouldn't have done to enable some functionality from one of their key players. And they did that some time ago. And then, next thing you know, it's been a while since the machine was patched, and so on and so forth. Just a constant layering on of things that probably shouldn't have happened, that created some real risk and some real vulnerability there. And we were very fortunate that we became aware of those risks before they were really exploited. But again, going back to earlier,
you know, we're most comfortable, again, with a lot
of structure and a lot of predictability. And that leads
us to sometimes getting very comfortable allowing the plane
to fly us and the plane will fly us really, really fast if we're
not careful.
Mm hmm. Very true. Talking about being
comfortable, and, you know, operating in a predictable
space, you know, when you think about the hackers and how they
are constantly innovating and coming up with the latest
methods and techniques, it's hard to keep up with them. And again, that's not what organizations are in the business of; whether it's an academic organization or some other organization, they have their own mission and goals. And of course, you know, there are always budgetary constraints. So under the
circumstances, how do folks like you try to ensure that your team
has the latest experience and expertise in keeping up with
these different evolving attack vectors?
That's a great question. I think the Cybersecurity and Infrastructure Security Agency (CISA), which is a branch of the Department of Homeland Security, does an exceptionally good job of creating awareness of a really complex and fast-changing environment. So you know, either through email or through automated feeds and other ways, we get real-time intelligence from CISA multiple times on a daily basis.
So as an executive, I just subscribed to their listservs.
And so today, you know, I've received email messages about
the need to patch vulnerabilities in Google
Chrome. And you know, there's a variety of other commercial packages out there. So we divide our information security team up into kind of a consulting arm, helping people around controls, and then we have an operations arm. And a part of that operations arm is around proactively, you know, patching the environment and creating awareness of the need
to do that. And they help the Institution and its IT staff stay on their toes when it comes to this type of changing environment. The other thing that they do really well is they monitor IP addresses that are known to be distributing malware or ransomware, or to be command and control points for existing installed malware. And, you know, on a daily basis, or certainly on a weekly basis, we get a feed of those IP addresses in an automated fashion, and our network firewalls will block both ingress and egress to those IP addresses immediately, which helps us as well. So I think that partnership has been really on point for helping us stay aware. And then the other thing that we do is we stay highly engaged with our counterparts in the Southeastern Conference schools, as well as our other peer and aspirational schools, constantly comparing notes and having conversations, as well as within the University System of Georgia.
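The feed-driven blocking workflow described above, pulling known-bad IP addresses automatically and blocking both ingress and egress at the firewall, could be sketched roughly as follows. This is an illustrative sketch only: the feed URL and its plain-text format are invented for the example, and real intelligence feeds (including CISA's) typically use richer formats such as STIX/TAXII.

```python
# Illustrative sketch of automated IP-feed blocking (not UGA's actual tooling).
# The feed URL and its one-address-per-line format are assumptions.
import ipaddress
import urllib.request

FEED_URL = "https://intel.example.edu/known-bad-ips.txt"  # hypothetical feed

def parse_feed(text):
    """Keep only lines that parse as valid IP addresses or CIDR networks."""
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        try:
            ipaddress.ip_network(line, strict=False)  # accepts hosts and CIDRs
        except ValueError:
            continue  # ignore malformed entries rather than blocking garbage
        entries.append(line)
    return entries

def to_iptables_rules(entries):
    """Block both ingress and egress for each bad address, as described above."""
    rules = []
    for ip in entries:
        rules.append(f"iptables -A INPUT -s {ip} -j DROP")   # inbound
        rules.append(f"iptables -A OUTPUT -d {ip} -j DROP")  # outbound
    return rules

if __name__ == "__main__":
    with urllib.request.urlopen(FEED_URL) as resp:
        for rule in to_iptables_rules(parse_feed(resp.read().decode())):
            print(rule)
```

In practice the rules would be pushed through the institution's firewall management tooling on a schedule, with stale entries aged out, rather than printed.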
Yeah, that makes a lot of sense. What about
the rest of the community? Your community in the field of
technology obviously, that's part of your job description,
you have to be on top of your game. But what kind of help and
support can you expect from the other business units, as well as
the individual stakeholders, whether it's faculty members,
whether it's students? What could or should they be doing to
help secure the environment?
Well, for somebody in my role, or for the CISO role, one of the most critical things is that you have
executive leadership that understands these
responsibilities aren't siloed responsibilities for the IT
folk, but they are business responsibilities that are shared
by everyone. And I think in the state of Georgia, frankly, that
recognition and that supporting philosophy starts at the top.
Governor Brian Kemp has been a very strong supporter and
advocate across the board for all state institutions to really
raise the game in terms of their cybersecurity defenses, and he
has been quite explicit that it is the division heads and the
CEOs of those major divisions, including the Chancellor of the
University System of Georgia who are ultimately responsible for
assuring the state and the state government that we are doing all
we can to reduce risk and to have the types of controls
around technology and its use that we need to have. Certainly
within the University of Georgia, in the 10 years that I
have been here, we've enjoyed that type of support from the
top, from President Jere Morehead. He was the number two
at the University 10 years ago when I was hired here and I
worked for him directly for a couple years and now I continue
to work for the Provost, the number two here at the
University. And so that tone really starts at the top and I
can tell you that, you know, we have division heads here, we call them Deans or Vice Presidents, and they all understand that they
are ultimately responsible to the Institution for managing the
risk, and that my office is a resource and a supporting arm, but it's not solely responsible for managing the risk in and of itself. That tone gets set constantly; for example, the security awareness training that began under Governor Kemp's leadership we now do twice a year. And under the leadership of acting Chancellor Teresa MacCartney at the University System of Georgia level, there's been a sizable investment in new infrastructure and supporting platforms for the cybersecurity training that Governor Kemp requires us to do twice a year. And so I'm really
fortunate again, I spent the last two days actually before
recording this podcast at the meeting of my counterparts in
the Southeastern Conference, and I think the mix of support and real advocacy around information security that we enjoy at all levels of government has been very, very helpful to us.
That's very, very assuring, that's good
to hear that you have great support from top management. So,
you know, Tim, you were mentioning my book, one of
the things I've emphasized in the book, which I have gathered
through my research is the importance of hands-on top
management. And I've seen in many companies, the exemplars
where the senior management takes on active roles, whether it's in the aspects of cybersecurity planning, strategizing, or performance review. They obviously are not experts, they don't claim to be experts, but they try to stay on top of things. It seems from what you shared, that's the way your
organization functions. That's the kind of support you have.
Anything that you'd like to add for people who are listening in
and who feel a little frustrated or let down that they don't see that level of active commitment? It's a sensitive topic, but I still thought of probing a little further because...
Yeah, well, I think part of it maybe is that when our President Jere Morehead was the
provost of the university, and in fact responsible for most of
the operations here at the university, we were getting
burned constantly by cybersecurity incidents. And I
think that created an awareness in him of the need to make sure that this was something that all executives understood was part of their responsibility to manage well. And I'm not going to, you know, curse us by mentioning how long it's been since we've had a major incident. You know, we have near misses all the time, just like everybody else. But again, that constant diligence coming from the business side, where we do have an understanding that these are business responsibilities first and foremost, has been
absolutely critical. I do think it's also really, really important that ultimately, whoever within an organization has the final responsibility and accountability for these types of risk management activities has to basically sit at the executive table with the CEO, in whatever form that exists within the organization. So I report to the number two here at the university, who's responsible for academic operations, which is, again, 70% of the university.
You know, but the President has a staff meeting every two weeks, and I'm a part of that staff meeting, and I have the opportunity to raise awareness of issues, bring visibility to things that should be visible to everyone, and be an advocate for sound practices. And then, you
know, I'm on a texting and cell phone relationship with the
president, whenever I need to get his attention to some
matter, the President is pretty easy to reach. In fact, last night, coming back from my meeting with my counterparts in the SEC, I debriefed the President later on a phone call in the evening to kind of compare notes on things that are going on. So, I do think,
you know, CEOs really understand that these are things that they
have to manage, and frankly, if they don't manage well, they are things that wreck careers. And so frankly, that helps, right? So you go back 10 years ago: a major cybersecurity problem inside of a business, and probably, you know, the CIO or the CISO, or both of them, are the two parts of the operation that really had some career risk there. I believe that awareness now extends all throughout the organization, and certainly at the executive level. To go back to the governor of the state, the CEO of our great state of Georgia, he had a couple of incidents on his watch when he was the Secretary of State, and I think he handled the response to those incredibly well; he left the place better than he inherited it. And he has brought that awareness to all arms at all levels of the state government, which has been truly, truly helpful.
Yep, that is extremely important, you are
kind of speaking to a couple of things that I emphasize a lot.
One being joint ownership and accountability. And the other is
trying to create that We-Are-In-It-Together culture, where everybody has to recognize that it's not IT's job or the information security unit's job to protect us; we also have a role to play. It's like the way we are fighting COVID, you know, we can't just sit back and expect miracles to happen; we have to recognize our roles and do our part. From the standpoint of enhancing the level of awareness, you mentioned, you know, conducting awareness training twice a year. And that's great. Now, there is a lot of research out there that speaks to the importance of customized training, that speaks to the importance of, you know, role-based training, training that shouldn't be one-shot, because people often don't remember the first time what they were trained in. And then another aspect that often doesn't get addressed is: are you measuring the effectiveness of the training? I know I asked you several sub-questions, but, you know, take it the way you're comfortable.
Yeah, I think there's a couple of things; I think we're raising the bar, right. I mentioned earlier this investment in kind of the training and awareness platform that the University System of Georgia has made. That platform has a lot of capabilities around, you know, simulated malware campaigns and some other kinds of tools to really take an exercise approach to helping raise the awareness of your organization. I think the information security training that we have done in the past has been quite rote and, frankly, not as polished as it could be. And this investment of resources by the system, I think, is really going to raise the bar quite a bit there for us. And, you know, between that and the commitment from the executive level of the organization, we have a quite optimal environment here at the University of Georgia right now to kind of continue moving the needle.
Now, from a communication standpoint, you
know, as a member of the University community, I will
often receive cybersecurity-related communications, and, you know, they're often long emails, and I can understand that, you know, certain things need to be
mentioned. Now, it's quite possible that when somebody receives a long email, they might be skimming through it, or might be reading parts of it, or might just ignore it. You will appreciate that part of effective communication is to
ensure that the message really gets across to the appropriate
folks. So, keeping that in mind, how do you make cybersecurity
communication more customized and more effective? Have you all
been giving this some thought?
Yeah, well, I think we certainly understand that we need to do a lot better job at this. You know, typically we have a very structured communication management program that goes around our initiatives and our operations that's designed to raise awareness, but
I think you hit the nail on the head: sometimes those communications are written from the standpoint of IT folk, which, you know, sometimes uses vocabulary and acronyms that really aren't well understood. And readers tend to disengage pretty quickly from that. Frankly, the whole question of whether or not email is the best vehicle for communicating these things also continues to be a concern; people don't read email as much as they used to, and the longer the email, the less likely you are to get the message across. So, you know, trying to create messaging that's targeted to smaller audiences is something that we're trying to do. And with some upgrades to our multifactor system, we're trying to be very specific and targeted, as opposed to global communications. The other thing we have to do is just make sure that when we communicate to people, we do so in a context that, you know, is accessible and relevant, through narrative: what's at stake for me, and what do I have in this? And, again, it has to be very personalized as well. And
again, I think we've got real opportunities to get much, much
better at that. When I came here 10 years ago, the knock used to be, well, you never told us anything you were doing; you know, now we beat people over the head with communications. But I still wonder sometimes whether the message is truly getting through. And the use of social media is becoming an important part of that as well, although, between sending a mass listserv to 50,000 people and posting something on Twitter repetitively to a much smaller audience, I'm not sure social media gets the broader reach. But, you know, we're trying to take multiple avenues and target the messaging to more specific audiences to get the word out, get the word across.
That's great to hear. You know, for instance, from a faculty member's perspective, it'd be good to know: given the role I play at the university, what are some do's and don'ts from a cybersecurity standpoint? Now, is this information not available? No, it is available, it's out there, but to get it in my inbox in a very targeted manner, and then, from time to time, being reminded that these are the things that you should focus on, that helps simplify things a little bit, as compared to a broad-brush approach, where you're being told what the sensitive assets are and what some scenarios are that you should be careful about. That's a little too generic. So that's just my
two cents. But I appreciate the candor and the recognition that
we can do better.
Let me just add one thing really quickly. We were at Auburn University for this meeting of my counterparts in the last couple of days. Auburn has done a really good job with messaging
around posters on entryways, you know, for their computer labs,
screensavers, and things like that. And that's probably
another opportunity where we need to get the word out a lot,
a lot more.
That's a great approach indeed. All right, so the next topic
that is also very close to my heart, is security audits and
drills. You know, something that I talk about a lot when I'm out
there, I say, you know, we have fire drills, do we have
information security drills? Do we plan for distributed denial
of service attacks and ransomware attacks? Now, I
know it's easier said than done, and organizations do tabletop exercises, but in your role as the technology person of the university, are you happy with the rehearsals that we have in place? Or can we do better?
Yeah, you know, I think we're doing well here; we certainly always can do better, but we really have implemented kind of the gold-standard approach to a security operations center. And a part of that center is, you know, a red team versus a blue team, and the red team are the friendly hackers who are empowered to probe ourselves and our systems and look for vulnerabilities. And so, again, being here at an institution, we are able to employ graduate students, undergraduate students, as well as some professional employees, and so we are constantly trying to hack the hell out of ourselves, using
many of the common methods that are out there and, you know, moving the needle in terms of not only penetration testing but also thinking about malware and ransomware. There are some tools out there now that we're looking at acquiring which, with some intelligence agents scattered around your enterprise, will tell you really quickly how easy it is to drop malware and other things. So we are constantly hoping to discover the major risks and vulnerabilities we have before others do; and again, we're not perfect yet; we're so big, we often miss things. But there's a huge investment in resources to do that. And, you know, I have to be careful about some of the stories I would share, but again, you will appreciate this given your expertise and your rich experience in consulting: many times when vendors and implementers install major infrastructure on campus, they flip the switch on and walk away, and they don't change the default passwords on things. And so we've discovered major things here at the University, from HVAC equipment to scoreboards in athletic venues, where if you knew what kind of make and model the thing was, and you knew how to use Google to find the instruction manual and get the default username and password, and you were on campus, you could actually control that stuff. And, you know, there have been several vulnerabilities like that that have been discovered.
And you know, really that goes right back to the question of
blind spots, right? So you've got an implementer whose view is: my job is to implement it and turn it on; they'll figure that other stuff out. And then you've got customers whose view is: well, we paid these experts to do it, so they had to do it right; we're in good shape. There's a blind spot between two well-intentioned, good groups of people working their best to do a hard job. And
so again, constantly attacking ourselves, again, using the well
understood red team approach is something we are very aggressive
with.
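The default-credential blind spot described in this exchange, vendor gear switched on and left with factory passwords, is something a red team can triage with a simple inventory check. A minimal, hypothetical sketch; the vendors, models, and credentials below are invented, and this is not the University's actual tooling:

```python
# Hypothetical sketch: flag inventoried devices whose make/model appears in a
# table of known factory-default credentials, so a red team can verify them
# first. All names and credentials here are made up for illustration.
from dataclasses import dataclass

# Known factory defaults, keyed by (vendor, model) -- invented entries.
DEFAULT_CREDS = {
    ("Acme", "HVAC-2000"): ("admin", "admin"),
    ("ScoreCo", "Jumbotron-X"): ("root", "password"),
}

@dataclass
class Device:
    vendor: str
    model: str
    ip: str

def flag_default_cred_candidates(inventory):
    """Return (device, username, password) triples worth checking first."""
    findings = []
    for dev in inventory:
        creds = DEFAULT_CREDS.get((dev.vendor, dev.model))
        if creds:
            findings.append((dev, *creds))
    return findings
```

A real check would then attempt an authorized, authenticated probe against each flagged device, which is exactly the red team's mandate here.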
Yep, that's very true; and talking about vulnerabilities and discovering vulnerabilities, another area of great concern to me is, we keep reading about these stories in the media that an organization was made aware, but did nothing about it
until it happened, right. And so I wonder, from an operations standpoint, I'm sure you all have a mechanism in place where you're logging all the intelligence you're receiving, and then you are evaluating it, and then either acting or not acting, but at least you're on record explaining the reasons for your decisions. So this way, you're maintaining a rigorous record of how you're handling intelligence, which later on, and I'm not a legal expert, but I think, you know, if you had to defend the organization, you could say that we've done everything, and this is how we thought during that period of time. So you kind of back up your actions. Your reactions to that?
Yeah, and let me just give you a little context first. You know, research flagships like the University of Georgia are not like vertical industries such as finance or manufacturing; we are research and innovation conglomerates. And we have 18 major units here at the institution, colleges and schools, that are invested in innovation in their fields. So we allow for a wide variety of different and non-standard approaches to running technology, because it supports research in engineering, in business, in the areas that you do, Dave, in public health, and so on and so forth. So that kind of very distributed, non-standardized environment increases risk dramatically. But
we have a couple of basic gatekeeping rules around that;
to begin with, everybody's got to run our antivirus. And
everybody's got to send their logs to our Security Operations
Center. And the tools for data mining and analysis around those logs just continue to get better and better. So again, one of the benefits of making everybody use the same standard antivirus engine, and we don't allow people to buy other antivirus products, is that we get incredibly centralized logging about packets that are downloaded from the internet. And many times, you know, we will see something through our intelligence that the end user is not aware of, and we can take action from that.
There's another very good product that is being commercialized by a computer science faculty member at Georgia Tech, formerly at the University of Georgia, that is very helpful in this space. And then again, kind of on the reactive side as well, with the ransomware near miss that we had, these new data mining tools are very good at looking for lateral moves through the network environment by people who've breached the environment, to see if they moved anywhere. And you know, again, it's kind of a big data
collection effort, right, you've got hundreds, if not 1000s, of
endpoints, all logging things. And if you can capture all that
data with the tools, you can get a fuller sense of what's going
on. But again, it is absolutely amazing, you know, used to we
would have to write our own scripts to kind of look for
things and then the tools come with standard templates. Now the
tools come with AI and machine learning, that merges all of
those things together to really give us a proactive sense. Now
these tools are expensive, they are absolutely expensive. But
you know, they're well well worth it. And it's a fast
maturity field. And again, we're very fortunate to operate in an
environment with a senior administration that that that
supports us with the resources necessary to be in this space.
We are early adopters.
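The log-driven hunting for lateral movement described above can be sketched in miniature. This is purely an illustrative example, not the university's actual tooling or any vendor's product: it assumes centralized authentication events of the hypothetical form (user, source host, destination host) and flags any account that fans out to an unusual number of distinct machines, one of the simplest signatures such tools look for.

```python
from collections import defaultdict

def flag_lateral_movement(auth_events, threshold=3):
    """Flag accounts that authenticate to an unusually high number of
    distinct hosts -- a common signature of lateral movement.

    auth_events: iterable of (user, source_host, dest_host) tuples
    drawn from centralized endpoint logs (hypothetical schema).
    """
    hosts_touched = defaultdict(set)
    for user, src, dst in auth_events:
        if src != dst:  # ignore local logons
            hosts_touched[user].add(dst)
    # Any account fanning out past the threshold is worth a closer look.
    return {user: sorted(hosts) for user, hosts in hosts_touched.items()
            if len(hosts) >= threshold}

# Made-up sample events for illustration only.
events = [
    ("alice", "wks-01", "wks-01"),    # local logon, benign
    ("svc-backup", "srv-01", "srv-02"),
    ("mallory", "wks-07", "srv-01"),
    ("mallory", "wks-07", "srv-02"),
    ("mallory", "wks-07", "srv-03"),  # fan-out across three servers
]
print(flag_lateral_movement(events))
# {'mallory': ['srv-01', 'srv-02', 'srv-03']}
```

Real products add time windows, baselining per account, and machine-learned scoring on top of this basic fan-out idea, but the underlying data is the same centralized endpoint log stream.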
Very, very good to hear. You know, you talked about all kinds of data and analytics that are available to us now, and that brings to mind performance measures and metrics. This is another one of those areas where organizations seem to be struggling, in terms of identifying what measures or metrics to capture and monitor when it comes to cybersecurity performance. What's your take on that?
You know, this is an area where I am not as much of a subject matter expert as I should be. I have a really strong information security team, and I trust their judgment; in some areas I'm really not the gatekeeper so much as the guardrails. Thinking about these KPIs, frankly, the most important KPI that I'm aware of is: have we had a major breach that resulted in increased vulnerabilities, reputational damage, or real damage to the institution and its customers? That is certainly one I keep in my pocket. But everything else is absolutely critical as well, from the number of users, the types of end users, and the types of access managed by those users, to metrics around how properly we decommission accounts when people exit the community. As well as stats on the volume of patching: what's our time to patch for a given grade of patch, medium risk versus low risk versus critical risk? Those are all really, really important. But the most important one, the one that the CEO cares about most, as do I, is the number of incidents and how many we have had. First and foremost, that's the one I keep in mind all the time.
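The time-to-patch metric mentioned above is straightforward to compute once patch deployments are logged by severity. A minimal sketch, with entirely made-up records and severity labels (nothing here reflects the institution's actual data or systems):

```python
from datetime import date
from statistics import mean

# Hypothetical patch log: (severity, vendor release date, deployment date).
patch_log = [
    ("critical", date(2022, 3, 1),  date(2022, 3, 3)),
    ("critical", date(2022, 3, 10), date(2022, 3, 11)),
    ("medium",   date(2022, 3, 1),  date(2022, 3, 15)),
    ("low",      date(2022, 3, 1),  date(2022, 3, 29)),
]

def mean_time_to_patch(log):
    """Average days from patch release to deployment, grouped by severity."""
    by_severity = {}
    for severity, released, patched in log:
        by_severity.setdefault(severity, []).append((patched - released).days)
    return {severity: mean(days) for severity, days in by_severity.items()}

print(mean_time_to_patch(patch_log))
# {'critical': 1.5, 'medium': 14, 'low': 28}
```

Tracked over time, a number like this makes it easy to check whether critical patches really are landing faster than medium- and low-risk ones, which is the comparison the metric exists to support.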
Yeah. Along those lines, if you were to think about rewards and incentive systems: it's a reward in itself if cyberattacks don't happen, that goes without saying, but do you have any thoughts about it? In reality, incentives help motivate a certain desired behavior. Any thoughts on what would be some good rewards and incentive systems to achieve the desired behavior across the organization, when security is not someone's job function?
Unfortunately, I think this is an opportunity for the whole profession more than anything else, because right now we probably have more sticks than we have carrots. I mean, one of the ways we keep our deans' and vice presidents' attention on these matters is simply that they know that if there's an incident on their watch, they're going to be in the general counsel's office with me, some of my folks, and the president's chief of staff, as we begin root-causing how whatever happened actually happened. That's an uncomfortable seat to be in for the three or four deans I've been in the room with when we've had to do that. And accountability works; it's really, really important. But the other thing that we do, a more indirect kind of carrot or incentive, is to really empower users, particularly a researcher in a lab, whether we're talking about the vet school or chemistry or something like that, by basically helping them understand how good security practices work and how those practices can enable them to do some innovative things without artificial controls and barriers imposed from the top of the institution. I think that really creates an incentive for people to have really good baselines around information security in their operations. So we certainly try to take that approach as well. Again, sharing from the meetings I just came out of: we trust our users a lot more here at the institution, and we do some things, compensating controls at the network level, which I could get into, that give us the ability to have more flexibility at the endpoint level, which we're very, very comfortable with. But again, I went to graduate school at Texas A&M, and I started my career in IT at that organization with some great mentors, people your listeners won't know, but gentlemen like Tom Putnam, Steve Williams, and Pierce Cantrell are real giants of our discipline in my eyes. The thing they all really baked into my noggin is that research institutions are research and innovation conglomerates, and you have to allow faculty the room to innovate; otherwise, you're defeating the whole mission of research and innovation at the institution. So we do a lot more aggressive things, with tools that are quite expensive, at the network level. That means we don't micromanage the endpoints in our environment, whereas a lot of other schools actively try to manage risk by managing endpoints. Making sure that we provide faculty members and staff members the flexibility to use tools as they best see fit to carry out their jobs or their research is, I think, one of the most important incentives that we can offer.
Yep, that is very true. And in that spirit of empowering the users to continue the mission for which they are at the institution: like we said at the very beginning, we are not in the business of security; we're in the business of doing what we do. But we cannot ignore security, because security is central to ensuring that we can all do our jobs well. I'd like to probe into another area, and that's empowering the chief information security officer. It is my understanding that you are the head of IT at the institution and the CISO reports to you. Is that correct? He does. Okay, so how do you ensure objectivity? The research literature talks about trying to keep the CISO function as objective as possible; the CISO should have a direct reporting relationship to the C-level folks. Again, this is a murky area, and you can do it in different ways. What's your sense about CISO empowerment?
Yeah, you know, I think what we have here at the University of Georgia works because of the leadership tone that the President sets and the way he's organized his team in a very collaborative way; it's not necessarily replicable at an institution without that culture and that kind of leadership tone. What President Morehead is looking for in all of his vice presidents is ultimate and final authority over their areas, subject, of course, to his review or his perspective on any matter. So, from a university governance standpoint, I am that final subject matter expert when it comes to IT matters, and for President Morehead that does include information security matters as well. That means I have signature authority over policy. But it's a servant leadership role; particularly in a collaborative environment like a university, it's not necessarily a hierarchical role at all. And within my team, we're very non-hierarchical as well. I know the University of Texas System, for example, has a system-wide rule that says the CISO cannot report to IT, because the concern is always that information security gets buried under the weight of fulfilling customer service requests and demands for functionality, and that's why you would split those roles off; the University of Texas System has done that for all of its institutions. Philosophically, I don't think it's the best mix, because when you do that, yes, you gain some increased visibility with that organizational structure, but you tend to divorce security a bit from operations. Now, they have done this at the University System of Georgia as well, but just for their office alone, and when you do it that way, the CISO is almost always focused on controls and standards at the expense of operations. And I worry, and this is President Morehead's genius, that what he doesn't want from the vice presidents or the deans is a lot of finger-pointing. If there's an information security incident, he doesn't want two subject matter authorities pointing the finger at each other: security saying, "These darn IT folks, if they'd get their act together, we would be okay," and the IT folks saying, "Those security people over there, this is their deal, their silo, not ours." And it's not just IT: the VP for Student Affairs is the final authority over student affairs, the VP for Instruction over instruction and teaching, and so on. But again, what works for us doesn't work everywhere; it particularly would not work in a very hierarchical organization.
So I know some CIOs who basically have a team of direct reports, and they'll bring that team together once every two or three months for a staff meeting, and otherwise meet with everybody individually. My team meets with me once a week, with everybody in the room. And everybody knows they have a responsibility to understand how they can be supportive of everyone else and to really understand the interdependencies they have with everybody else, including information security. They also meet without me once a week on their own; I think they do that to try to figure out how to collectively manage me better, or something like that. But it's very non-hierarchical and very collaborative: everyone around the table has an equal seat and an equal voice on the matter. And that mirrors the way the President runs the university. If the CISO were buried under me in a very hierarchical way, that might really be grounds for concern. But again, because of my style and approach, Ben Myers, the CISO, has his own relationship with the general counsel. He has his own relationships with deans; I don't gatekeep him from collaborating and building relationships around here. I guess the only area where I would gatekeep his access is the President's staff meeting, but that's the way the President runs the meeting: if we're going to bring somebody to the meeting, it comes through us. This is a field that's fast changing, and I know what the University of Texas has going on is working for them. And frankly, I'll also say the University System of Georgia really began moving the needle from a policy and control standpoint when they separated information security from IT operations, so I think what they've done is working for them also.
Wonderful, Tim, thank you so much for your time; this has been extremely enlightening. We've covered a lot of ground. Any final thoughts?
You know, I think this is one of the most interesting and dynamic fields there is in IT, and I tell my students that if you want a super career for the next 20 years, guaranteed, this is a space to really explore. You don't have to be incredibly technical; you have to be technical enough to know what's going on, at least from the 25,000-foot view and up. But it is a real opportunity. Again, I was in a staff meeting with the CISO and my team earlier today, and just hearing a report on some of the new investments they would like to make in tools, and on how AI is fast evolving as a threat monitor, is absolutely incredible. Also, from a student standpoint, I'm a huge advocate for them thinking about this space and investing in it. And, you know, again, I've been fortunate to work for great people and for great organizations. Having been here at the University of Georgia now for 10 years, red and black runs in my blood. I consider myself very fortunate to be able to do the job that I've done. But I do it knowing that I'm a caretaker for a while, and I'm going to leave it to somebody at some point. The thing that I've tried to do is to leave behind an organization, a team, and a pool of talent that gets the job done. And I think we're making that work really well today.
Fantastic. And Tim, thank you for what you
do for the Institution. It's been a pleasure to work with you
as a colleague and thank you again for doing this podcast
with me today.
It's been a pleasure. Thank you.
A special thanks to Dr. Timothy Chester,
for his time and insights. If you like what you heard, please
leave the podcast a rating and share it with your network. Also
subscribe to the show, so you don't miss any new episodes.
Thank you for listening, and I'll see you in the next
episode.
The information contained in this podcast is for
general guidance only. The discussants assume no
responsibility or liability for any errors or omissions in the
content of this podcast. The information contained in this
podcast is provided on an AS IS BASIS with no guarantee of
completeness, accuracy, usefulness, or timeliness. The
opinions and recommendations expressed in this podcast are
those of the discussants and not of any organization.