Episode 78
Earned Trust: Reimagining Data Security in the Zero Trust Era with JR Williamson
Have you heard? Data is the new oil. JR Williamson, Senior Vice President and Chief Information Security Officer at Leidos, is here to explain where data’s value comes from, the data lifecycle and why it is essential for organizations to understand both of those things in order to protect this valuable resource. Join us as JR breaks it all down and also explores the concept he dubbed “risktasity,” which he uses to describe the elasticity of rigor based on risk. As he says, “when risk is high, rigor should be high, but when risk is low, rigor should be low.”
Key Topics
- 00:00 Migration to the cloud has increased vulnerability.
- 04:50 People want decentralized work, including mobile access.
- 08:14 Shift from application to democratizing access to data.
- 10:53 Identify, protect, and manage sensitive corporate information.
- 13:49 Data life cycle: creation, management, access, evolution.
- 20:10 Computers augmenting humans, making good decisions, insights.
- 23:19 The importance of data in gaining advantage.
- 27:04 Adapting to AI to anticipate and prevent breaches.
- 28:51 Adoption of large language models in technology.
- 33:03 Identity and access management extends beyond authentication.
- 36:33 Leveraging strengths, improving weaknesses in tennis strategy.
Tracing the Cybersecurity Evolution and Data's Ascendancy
Evolution of Cybersecurity
JR provided a snapshot of the past, comparing cybersecurity practices from the 1990s to what we see today. With 37 years of experience, he recalled a time when IT systems were centralized and attack surfaces were significantly smaller. Contrasting this with the present, he described how the migration to cloud services has expanded the attack surface. JR noted an increase in the complexity of cyber threats driven by the widespread distribution of networks and the need for anytime-anywhere access to data. He stressed the transition from a focus on network security to a data-centric approach, where protecting data wherever it resides has become a paramount concern.
Data Life Cycle: "So part of understanding the data itself is the data's life cycle. How does it get created? And how does it get managed? How does it evolve? What is its life cycle, cradle to grave? Who needs access to it? And when they need access to it, where do they need access to it? It's part of its evolution. Does it get transformed? And sometimes, back to the risktasity model, the data may enter the content life cycle here at some level. But then over its evolution may raise up higher." — JR Williamson
The New Oil: Data
In the world JR navigates, data is akin to oil: a resource that, when refined, can power decisions and create strategic advantages. He spoke passionately about the essence of data, not just as standalone bits and bytes, but as a precursor to insights that drive informed decisions. Addressing the comparison between data and oil, JR stressed that the real value emerges from what the data is transformed into: actionable insights for decision-making. Whether it's responding with agility in competitive marketplaces or operating in the context of national defense, delivering insights at unmatched speed is where significant triumphs are secured.
Importance of Data Security
JR Williamson on Data and "Risktasity"
JR Williamson stresses the heightened necessity of enforcing security measures that accompany data wherever it resides. As the IT landscape has evolved, the focus has broadened from a traditional, perimeter-based security approach toward more data-centric strategies. He articulates the complexity of managing and safeguarding data in a dispersed environment, where data no longer resides within the confines of a controlled network but spans a myriad of locations, endpoints and devices. This shift has rendered traditional security models somewhat obsolete, necessitating a more nuanced approach that can adapt to the dynamic nature of data.
The Value of Data in Decision-Making: "The data in and of itself is really not that valuable. Just like oil in and of itself is not that valuable. But what that oil can be transformed into is what's really important, and that's really the concept." — JR Williamson
Data Security Experiences
Both Mark and Carolyn resonate with JR's insights, drawing parallels to their own experiences in cybersecurity. Mark appreciates the straightforwardness of JR's "risktasity" model, which advocates for security measures proportional to the evaluated risk. This principle challenges the one-size-fits-all approach to cybersecurity, fostering a more tailored and efficient allocation of resources. Carolyn, in turn, connects the conversation to her own history of grappling with the intricacies of data classification and control. She acknowledges the tactical significance of understanding which data warrants more stringent protection, as well as the operational adjustments required to uphold security while enabling access and utility.
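To make the risktasity idea concrete, here is a minimal sketch in Python. The risk tiers and control names are illustrative assumptions made for this page, not JR's or Leidos's actual policy; the point is simply that rigor stretches and relaxes with assessed risk.

```python
from enum import Enum

class Risk(Enum):
    LOW = 1
    MODERATE = 2
    HIGH = 3

# Hypothetical control tiers, used for illustration only.
CONTROLS_BY_RISK = {
    Risk.LOW: ["single-factor login", "standard logging"],
    Risk.MODERATE: ["multi-factor authentication", "encryption at rest",
                    "periodic access review"],
    Risk.HIGH: ["multi-factor authentication", "encryption at rest and in transit",
                "managed-device requirement", "continuous monitoring"],
}

def required_rigor(risk: Risk) -> list[str]:
    """Elasticity of rigor: the controls applied stretch or relax with assessed risk."""
    return CONTROLS_BY_RISK[risk]

print(required_rigor(Risk.LOW))   # low risk, low rigor: minimal friction
print(required_rigor(Risk.HIGH))  # high risk, high rigor: friction is accepted
```

In practice the mapping would come from an organization's own risk assessments and regulatory obligations rather than a fixed table like this one.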
Data Governance and Security Strategies
Understanding Data Security and Lifecycle
JR emphasizes the importance of understanding the data's lifecycle, acknowledging that comprehensive knowledge of how data is created, managed and ultimately disposed of is a cornerstone of effective cybersecurity. This involves not only recognizing the data's trajectory but also identifying who needs access to it, under what conditions, and how it may evolve or be transformed throughout its lifecycle. With such a deep understanding in place, JR suggests, it becomes possible to design governance systems that are not only effective in theory but also practical and integrated into the daily operations of an organization.
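One way to picture the lifecycle JR describes is a small record that travels with each piece of data and is updated as the data evolves. The field names, stages and example asset below are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass, field
from enum import Enum

class Sensitivity(Enum):
    PUBLIC = 1
    INTERNAL = 2
    RESTRICTED = 3

@dataclass
class DataAsset:
    """Illustrative lifecycle record for one piece of corporate data."""
    name: str
    owner: str
    sensitivity: Sensitivity
    authorized_roles: set = field(default_factory=set)
    stage: str = "created"  # created -> managed -> transformed -> retired

    def reclassify(self, new_level: Sensitivity) -> None:
        # As the data evolves it may "raise up higher"; governance has to follow it.
        self.sensitivity = new_level

    def advance(self, new_stage: str) -> None:
        self.stage = new_stage

asset = DataAsset("bid_model.xlsx", owner="capture-team",
                  sensitivity=Sensitivity.INTERNAL,
                  authorized_roles={"analyst"})
asset.advance("transformed")
asset.reclassify(Sensitivity.RESTRICTED)  # tighter controls now travel with the asset
print(asset)
```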
Strategy and Organizational Support
Transitioning from a theoretical framework to practical execution, JR discusses the necessity of an effective data protection model that can operationalize the overarching strategy. To accomplish this, an organization must develop a structure that aligns with and supports the strategic objectives. JR identifies that existing structures often serve as the most significant barriers when agencies work on implementing new cybersecurity strategies. Organizations must be prepared to confront and renovate legacy systems and management frameworks. This is a challenge that became increasingly evident as organizations rapidly shifted to cloud services to accommodate remote work during the pandemic.
Insights from Data Security and AI Impact
Transformation of Data into Actionable Insights
Like oil, data's true value isn't in its raw form; it lies in the conversion process that transforms it into insights for decision-making. JR reflects on the progression of data turning into information, which then evolves into knowledge, culminating in actionable insights. Just as the versatility of oil lies in its ability to be refined into various fuels and materials, the potential of data is unlocked when it is analyzed and distilled into insights that inform crucial decisions. JR emphasizes that the effectiveness of insights hinges not just on accuracy but also on understanding the context in which they are applied. He suggests that these refined insights amount to competitive advantages, enabling quicker and more informed decision-making in mission-critical environments.
The Importance of Data Insight in Business: "Getting the insight in and of itself is important. But combining that insight with understanding of the problem we're trying to solve is really where the competitive advantage comes into play." — JR Williamson
AI's Speed Impact on Cybersecurity and Defense
JR expresses apprehension regarding artificial intelligence's acceleration and its implications for cybersecurity and defense. This unease stems from AI's capability to operate at a pace vastly superior to human capacity. Such rapid capabilities could lead to a perpetual struggle for cybersecurity professionals, who are tasked with defending against AI-driven attacks that continually outpace their responses. For organizations to not only protect themselves but also remain competitive, JR advocates for the adoption of similar AI technologies. By leveraging advanced tools, organizations can preemptively identify vulnerabilities and secure them before they are exploited by adversaries. He alludes to an emerging arms race in cybersecurity, driven by AI advancements that necessitate a proactive rather than reactive approach to digital threats.
Shifting Mindset in Data Security and Zero Trust Architecture
Broader Perspective on Defensive Data Security
Carolyn and Mark, touching on the complexities of cybersecurity, speculate about a potential paradigm shift. Rather than focusing solely on prevention, they wonder if the strategy might pivot towards containment and control once threats are within the system. JR agrees that in today's vast and interconnected digital environment, absolute prevention is increasingly challenging. Though cybersecurity has traditionally been likened to reinforcing a castle's walls, JR argues that due to the dispersed nature of modern networks and cloud computing, this approach is becoming outdated. Instead, organizations need to be agile and resilient, with security measures embedded within the data and applications themselves, ensuring they can quickly detect, mitigate and recover from breaches.
Dissecting the Concept of Zero Trust Architecture
JR expresses discontent with the term "zero trust" because it implies offering no trust whatsoever, which would stifle any exchange of information. He advocates for the terms "earned trust" or "managed trust" to more aptly describe the nuanced relationship between users and the systems they interact with. Security architecture, JR illustrates, should not rely solely on verifying users' identities; it must also account for the integrity and security posture of the devices and locations used to access the data. By meticulously understanding which data are most sensitive and their lifecycles, organizations can ensure that access controls are rigorously applied where necessary, based on the type of data, the user's context and the access environment. This nuanced approach is fundamental to constructing a robust and adaptive zero trust architecture that evolves along with the organizational ecosystem.
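As a rough illustration of the earned-trust decision JR describes, the sketch below combines identity, role, device posture and location, and demands more of the requester as data sensitivity rises. The inputs and thresholds are hypothetical, not a description of any particular product or of Leidos's architecture.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool   # do we have assurance in who the user is?
    role_authorized: bool      # is this role allowed to see this class of data?
    device_managed: bool       # endpoint posture: managed, patched, compliant
    location_trusted: bool     # e.g., an expected geolocation or network
    data_sensitivity: str      # "low", "moderate", or "high"

def earned_trust_decision(req: AccessRequest) -> bool:
    """The more sensitive the data, the more the requester must bring to earn trust."""
    if not req.user_authenticated:
        return False
    if req.data_sensitivity == "low":
        return True                                   # low value: identity alone is enough
    if req.data_sensitivity == "moderate":
        return req.role_authorized and req.device_managed
    # high sensitivity: identity, role, device posture and location all factor in
    return req.role_authorized and req.device_managed and req.location_trusted

# Right user, wrong device: access to high-sensitivity data is denied.
print(earned_trust_decision(AccessRequest(True, True, False, True, "high")))  # False
```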
About Our Guest
JR Williamson is accountable for information security strategy, business enablement, governance, risk, cybersecurity operations and classified IT at Leidos. JR is a CISSP and Six Sigma Black Belt. He serves on the Microsoft CSO Council, the Security 50, the Gartner Advisory Board, the Executive Security Action Forum Program Committee, and the DIB Sector Coordinating Council. He is also part of the WashingtonExec CISOs, the Evanta CISO Council, the National Security Agency Enduring Security Framework team, and is the Chairman of the Board of the Internet Security Alliance.
Episode Links
Transcript
Carolyn Ford [:This morning, we get to talk to JR Williamson, who is the senior vice president and CISO at Leidos, not to be confused with "siss-o," because I was just schooled in that pronunciation. There are no sissies. It's "see-so." Right?
JR Williamson [:That's right. No sissies in cybersecurity.
JR Williamson [:These roles are not for the faint of heart.
Carolyn Ford [:That is right. So senior vice president and CISO at Leidos, JR has more than 37 years of experience in IT engineering and cybersecurity, including almost three decades at Northrop Grumman. And recently, he spoke on a panel at Billington Cybersecurity Summit, which, Mark, you and I both had the pleasure of attending.
Mark Senell [:That was great.
Carolyn Ford [:Yeah. It was really good. It was all about protecting data in a zero trust world. And, he shared some really interesting perspectives on data security. And there were some surprises in that panel, I'm not gonna lie, that we'll get to. But we're really excited to be able to revisit the topic with JR today, unpack it a little bit more on Tech Transforms, and, talk about how cybersecurity has evolved over time and how it continues to evolve with new technological advancements. So welcome to Tech Transforms, JR.
JR Williamson [:Thank you. I'm very pleased to be here.
Mark Senell [:Good to see you, JR.
Carolyn Ford [:Yeah. So let's jump right into it. As I mentioned, you've had a tremendous career focused on IT engineering and information security with almost three, well, over three decades worth of experience in the field. So where have you seen the biggest evolution in cybersecurity practices? And let's go back to the 90s. I know that it's hard to think that cybersecurity was around then, but it really was. And I'll have you talk about that. Are there any practices or trends that you foresaw, as well as any that you didn't see coming that really took you by surprise?
JR Williamson [:Well, I think yes to all of those things. I mean, I think the hard part, if you go back to the nineties, is everything was sort of centralized. You know, we were shifting from mainframes to distributed systems, but distributed systems were still in our networks and still under our control, and so the attack surface back then was really so much smaller. And what's changed, of course, is this whole migration. You know, the cloud, the cloud. We've gotta go to the cloud. And in so doing, you know, we've completely opened up the attack surface.
JR Williamson [:And at first, we were very focused on our networks and whether our network's okay. And then we opened up applications and said, hey, well, are our applications okay? Are our servers okay? And then, of course, it all became about the data, and where is our data, who has access to our data, etcetera. And I don't even wanna get into yet the sorta endpoints themselves. You know? The problem is that as I distribute my applications and my networks and my data, everything's all over the place. Well, now my access to that is from anywhere, any place, anytime. Right? Isn't that the mantra? Any place, anywhere, anytime, which means the endpoints themselves are important.
JR Williamson [:So now it's less about just who you are and the identity. Now it's about the application, the integrity, the posture management, your geolocation, you know, all these factors that now come into play around is my data okay. So that world has changed significantly since the nineties. And that's, of course, increased our risk posture. And it's forced us to rethink how do we protect what's most sensitive, and it's no longer about just protecting networks.
Carolyn Ford [:Yeah. It used to be, like, the castle and moat analogy that we're all so familiar with, but it really I mean, that was it. Right? We had our we had our house. We had our domain, and we protected it. And that's, like you said, we're decentralized now.
JR Williamson [:Highly decentralized and distributed. And people wanna work differently, and it's not just the treatise around mobility. People wanna work on mobile devices and things they wear on their wrist, things they embed into their bodies or attach to themselves. You know? So sure. All that is the case, but I think what's really changing now is the fact that when we have that access, that access can be much more easily sorta taken over because it's so ubiquitous. We're now having to defend against all this variety of fronts to access that data and verify and validate that it is a valid user who's got appropriate access from an appropriate location, that it's okay for them to have access to this type of data and to do the function that they wanna do. And I think the thing we talked about at the Billington panel too, which is a big worry bead for me, is sure.
JR Williamson [:Yeah. Okay. Now I got it. My head's now wrapped around the data and this intersection with users and nonhuman users, synthetic processes interacting with this data. So, okay, I'm getting that, and I gotta wrap these controls around it and protect it wherever it exists. And then the rise of the API, because as more applications and data move into the cloud, that data isn't under your control like in a typical infrastructure service, but now it's all software as a service. So you're really counting on and relying on that provider to do all those security things for you, on your behalf. And then, wait for it.
JR Williamson [:Application to application, SaaS to SaaS, and we don't see those interactions. You know, the SaaS provider I made a contract and a deal with to protect my data and provide access to that data according to our rules and our governance, they then choose to have an agreement with a third party who's gonna add value to their offering. I don't see any of that.
Carolyn Ford [:Yeah. I don't get to agree to that supply chain.
JR Williamson [:My rules don't come into play.
Carolyn Ford [:Yeah. You just brought out all those can of worms.
Mark Senell [:Oh, yeah. I mean, this is just me thinking about it, and I don't know if this is a stupid question, I apologize. But is there, like, this shift in mindset, since this attack surface has gotten so broad and so big, almost uncontrollable, that it's almost like we're less concerned about keeping people out, that we're just like, if they get in, they get in, but they're never getting out, kind of a concept?
Carolyn Ford [:Oh, wait. We're gonna be Hotel California? Seriously?
JR Williamson [:Oh, yeah. No. Yeah. Okay.
Mark Senell [:You can check out anytime.
JR Williamson [:if it just can never leave.
Carolyn Ford [:Ever leave. Yeah. Go ahead. Come on in.
Mark Senell [:Does that help?
JR Williamson [:Yeah. I think there's a lot of extreme models there. Yeah. I think from my point of view, it's not giving up because, you know, it's so hard and the attack surface has got more difficult. But I think it's about really zeroing in on the fact that we now have to really persist these protections with the data itself. So it's less about, you know, the network. That's what we used to think about a lot. It's less about the server.
JR Williamson [:We used to think a lot about that. It's less about the application. We used to think a lot about that, and now it's the data, because the data used to only be accessible via the application business logic, via the server that ran it, via the network, you know, that connected to it. And now that's all changed, and this data can persist in many different ways, particularly when it's mashed up. You know? So I'm doing SQL queries. I've got this information over here. I got this information here. We're trying to, you know, democratize access to the data. The citizen developers and the citizen knowledge workers who need to use this data to make great business decisions for their functions were given this access, and now that data is, whoosh.
JR Williamson [:It's all over the place. You know, how do I keep control of it? And so really getting good at understanding what that data is, what its use cases are, what it's appropriate for, and then putting governance around that element of information, which could lead to an insight, when mashed up, and analyzed, that's the concept behind the oil. You know, that the data is a new oil. And then ultimately, ensuring that those principles and properties around safety persist with the data regardless of where it is.
Mark Senell [:Oh, you have to you have to we have to talk about that, Carolyn, because I didn't hear that.
Carolyn Ford [:Data is the new oil?
Mark Senell [:Data is the new oil. I wanna understand your thoughts about what that means.
Carolyn Ford [:Yeah. So, well, before you do, I just wanna say, as you're talking, JR, I'm like, I feel like he's talking about securing air, because, like, the way you describe it, everything that you're trying to do is just massive. It's kinda breaking my head. So before we go all the way to the data is the new oil, because that's where I wanna get to too, I wanna know how you even approach this problem. I mean, Northrop Grumman, Leidos, they're two really big organizations. Where do you even begin to lead the charge of protecting your data?
JR Williamson [:Well, I think first and foremost, it is a big problem. Let's just agree to that. It is a hard issue. But second is you gotta find what's important. I mean and not everything that you do is as important as the next thing. And so really zeroing in. Sometimes you hear this term crown jewels. I think it's a little overused, but the concept is pretty easy to get your head around.
JR Williamson [:You know, that there are some things in your organization we do that either drive competitive advantage to the corporation, that if lost or altered beyond your intention could cause harm to your business financially, to your shareholders, or it's customer data that you have that's super important, that if lost it could harm the customer mission and what they're intending to do. And third, sometimes I'm working, of course, with partners in my supply chain, in my ecosystem here. And sometimes I have their sensitive data that's really important to them, that I'm now a steward of, because I use that in some meaningful way to produce some outcome for our joint customer. Whether I'm a prime contractor or a sub, you know, wherever I am in that supply chain, you know, protecting that sensitive information is also important. So knowing what those things are is the first step. And, yes, it takes effort to go do that. That's a lot of conversations with a lot of different people and data owners to really figure out what's the most important. And, yes, there is a regulatory landscape with some of this, you know, when you're working in the government contracting world, and so some of that is very prescriptive, but other parts of it are not.
JR Williamson [:And you really need to develop an effective risk understanding and a model around managing that risk to ensure the protection of that data. You know, look. I'm an engineer. Engineers, we make up words. I made up a word. I call it risktasity. And risktasity is this word that really talks about and describes the elasticity of rigor based on risk. And so the concept is simple: when risk is high, rigor should be high.
JR Williamson [:But when risk is low, rigor should be low. Because rigor creates friction. And friction is a speed problem.
Carolyn Ford [:Oh my gosh! I can't. Mark and I both worked for Raytheon. And as you're talking, I'm having a little bit of, you know, flashback. I don't wanna go as far as PTSD, but maybe a little bit. Like, the whole classification problem.
JR Williamson [:Oh, sure.
Carolyn Ford [:You know? But I wish I would have had this philosophy. I would have put it up on my wall, risktasity. So rigor based on risk. So the higher the risk, it would have really helped me.
Mark Senell [:This is a fabulous term.
Carolyn Ford [:I love it.
Mark Senell [:The concept to me seems simple. It seems like common sense. But I think sometimes common sense isn't all that common.
Carolyn Ford [:I needed that. I needed that up on my wall as I'm trying to think, okay, do I need to, like, lock this up. Do I need to burn it? Do I need to burn that part of my brain because I saw it? You know? So the first thing the first thing your first approach is to know what's really important. I love this risktasity part of it, to know where it is and then to implement a model to protect it.
JR Williamson [:That's right. And protect it wherever it needs to be. So part of understanding the data itself is the data's life cycle. How does it get created? How does it get managed? How does it evolve? What is its life cycle, cradle to grave? Who needs access to it? And when they need access to it, where do they need access to it? And does it, you know, it's part of its evolution. Does it get transformed? Sometimes, back to the risktasity model, the data may enter the content life cycle here at some level, but then over its evolution may raise up higher. As it gets raised up, your policy and your governance of safeguarding that data must also go with it. And so that's a really difficult thing to do, but that's the job. That's the concept.
Mark Senell [:Is this conceptual?
JR Williamson [:We're trying to do.
Mark Senell [:JR, is this conceptual for you, or is this, actually being applied?
JR Williamson [:Oh, it's both. Yeah. It starts with the concept, but it has to be fused. Like all good governance, governance doesn't work unless it's baked into the quality of your daily work. So you have to start with that sort of conceptual idea, that strategy, around data understanding, and then come up with an effective data protection model that implements the strategy. I mean, but I think we could agree structure follows strategy and not vice versa. Once I understand where I'm trying to go, now I organize around it, and that's the structure that you have to put in around data protection.
Mark Senell [:Right.
Carolyn Ford [:Did you help agencies with their strategies and implementation?
JR Williamson [:We do. In fact, certainly, from our Northrop days and definitely at Leidos, you know, sort of the largest IT provider, you know, to the federal government. You know, we work this on many levels, you know, for our customers. And this sort of way of thinking about the data is really important, but then operationalizing it, you know, procedurally, to actually take the concept of risktasity and bake it into that daily work that we do, is really the secret sauce.
Carolyn Ford [:What was the biggest roadblock? What's one of the biggest roadblocks that you've seen as you're helping agencies with their strategy and implementation? And if you wanna mention one of the agencies that was the biggest pain in the ass.
JR Williamson [:I won't mention any of my super important customers, but what I would say is that their existing structure gets in the way of strategy. So a lot of times people get it conceptually pretty easily, but then it's like, I'll never be successful implementing that because of this structure, this governance model, this policy, you know, this person in the way that they think about the problem.
Mark Senell [:So there's an exact modernization step that needs to take place with some customers first, before they can kind of take this approach.
JR Williamson [:It's difficult. And I mean, and not to be pejorative about it, but, you know, the journey to cloud was very similar. I mean, it was really difficult to sort of decouple, you know, just years of learning that we built into our cyber defensive posture, you know, for on-prem capabilities. And now to pull that apart and to put it onto somebody else's network that you don't control, or onto somebody else's server that you don't control, that was a difficult problem. And so the structure of how we did things to keep things safe became the biggest inhibitor to adopting a new business operating model. And similarly, we're seeing that with our customers, because there's a whole lot of legacy here. It's not their fault.
JR Williamson [:In fact, sometimes you could argue it's our fault. We're the government contractors. We built a lot of this for them, to meet their mission outcomes, you know, over the decades. And now that has to be sorta bimodally redesigned, redeployed, and then brought in. And then the old system sort of just gets retired in place over time, because it is almost impossible just to do it piecemeal, you know, one by one, to try to transform.
Mark Senell [:JR, when the pandemic hit. So we've been dealing with customers for a decade trying to modernize, trying to transform, you know, and the pandemic hits, and lo and behold, boom. Agencies moved at light speed to the cloud. They moved to support, you know, work from home. Okay. What happened there? I mean, it took a decade.
JR Williamson [:Lots of cyber problems happen there. Yeah.
Mark Senell [:So that opened up, I guess, a can of worms in a lot of other areas, I assume.
JR Williamson [:Yeah. Yeah. For sure.
Carolyn Ford [:Alright. Well, let's talk about data being the new oil. And I will say, you know, the first time I ran into that was Paul Scharre's book Four Battlegrounds, and he's an AI expert. And I read that and I was like, this is interesting. And he also posits that whoever owns the data is going to win the AI race. So I don't, I mean, that may have taken us down the AI road a little too soon, because I wanna talk about data being the new oil in the context of zero trust and cybersecurity. So how does this mindset around data impact how agencies should be protecting their data and how they're using it?
JR Williamson [:Yeah. All great questions. And I think there's a couple of different concepts out there around, you know, data is the new oil. And so I'll sort of define it from JR's point of view, and then I'll talk about why it's not just the new oil, because there's another evolutionary element to this, I think. So first, data as the new oil is really just here to describe that ultimately, and we've heard this whole story, you know, about data turning into information, information turning into knowledge, knowledge producing insights. Why is that important? Because we're here to make choices. We're here to make decisions, and almost every application that exists on the planet that is not just transactionally based is here for decision support.
JR Williamson [:And the idea is computers augmenting humans making choices, making good decisions. And when you put that into the mission context, you know, whether that's sensor to shooter, that's fight or flee, you know, that's confidence intervals around choices that we make within the mission parameters. All of that is about getting to insights. What does the data actually mean? So the data in and of itself is really not that valuable. Just like oil in and of itself is not that valuable. But what that oil can be transformed into is what's really important, and that's really the concept. You know, the data is sort of that new oil. So now what are we doing, you know, as IT providers and as data and analytics types of providers, to transform the oil or to transform the data?
JR Williamson [:What are we turning it into? How quickly and how effectively and how responsibly are we able to convert that data into insight and to get that insight into the hands of a decision-maker, to make a choice? Now the decision-maker in the future may not even be a human. It may be a machine, you know, back to the sort of augmented intelligence. Artificial intelligence is not a thing I'm super excited about as a term, but I like this idea of how humans partner with machines more effectively, and so I think of that more as augmented intelligence. And then putting that into the hands of that decision-maker, whether that decision-maker is software with appropriate and responsible and ethical constraints or a human. But that's the transformation. So we're transforming that data into something that is useful. And by useful, I really mean decisionable.
Carolyn Ford [:So, I wanna, okay. This is why I like to talk to people a lot smarter than me. Because what you just said about data...
JR Williamson [:We better get somebody else on the call then.
Carolyn Ford [:Well, okay. But you made me think about, so when I hear the word data, I immediately jump to the insight. I just think of it that way. But if we think about your idea of risktasity, maybe it's not all data that we're protecting. Really what we're protecting, as we move up this chain that you just described, is the analysis and the insights. Or is it both, but just to a varying degree? Like, when you put your security controls in place, it doesn't all have to be as tightly controlled.
JR Williamson [:Well, and where does it need to be controlled? You know, because, ultimately, you're here trying to drive some outcome. And we're really trying to use it. The whole data-driven enterprise strategy is that it's not just humans with their own personal insights and personal experiences. Those are great. You know, sometimes they're invaluable, particularly when you're in the field, you know, on point, on mission, trying to solve a problem. But we know that we make better decisions as humans when we have better data. And ultimately, the better data can lead to a transformation of the data into that insight. And that's that knowledge management journey, somewhat intractable as it feels at times, but that's really what we want.
JR Williamson [:And that's competitive advantage, you know, because when you're on a decision point and you're running against the clock, then whoever can come up with the insight first is ultimately the one who's likely to win. And so the insight, and I should say this is where some people think about data as the oil and say, well, is it really? Because getting the insight in and of itself is important, but combining that insight with understanding of the problem we're trying to solve is really where the competitive advantage comes into play. Because sometimes somebody could give you the answer, and the answer could be absolutely valid. But if you don't know how to apply the answer, is that really effective? And so there are other parameters, you know, to think about this too. But first and foremost, we need to mine the data, finding the data, understanding the data, transforming the data into an insight and then applying the insight to the business outcome that we need. And when you can do that, you can do that with speed. You can do that with efficacy, you can do that with responsibility and ethical practices. That's a win.
Carolyn Ford [:Well, and you're talking about innovation. Right?
JR Williamson [:Well, innovation may be part of the processes that underlie how you go do those transformative things. But what I'm really talking about is getting the data itself into a form, into a shape that can actually be actionable, you know, for whatever your mission outcome is.
Mark Senell [:So we've kinda talked a little bit about this, you know, over the last few minutes, augmented intelligence, AI. It seems to me it's so topical in conversations today that it is the new new thing. How do you see AI transforming or impacting cybersecurity, data protection, data transformation, as we move forward?
JR Williamson [:Well, I guess I'd say a couple of things about it. You know, first of all, it's scary, because, you know, machines can move with speed, that we cannot, and that is a concern. I mean, imagine, you know, watching an episode of The Flash. You know, what is the most scary about The Flash? You know, it's the fact that I have a plan. You know, I'm using my skills. I'm using my understanding. I'm using my competencies to accomplish something, but the Flash could do it so much faster than me. I'll never win.
JR Williamson [:You know, the Flash can always outrun me, outperform me, and that's an issue with our competition. And when you think about great power competition, that's a concern. You know? And so in the competitive market space, who's first to market tends to, on average, succeed, you know, here, in great power competition issues.
Mark Senell [:Very good.
JR Williamson [:You know, he who's capable of both defending, effectively, as well as projecting, you know, force when required is the one who's most likely to succeed. And so that becomes a concern around speed. These tools now have the ability to speed things up. So if you're in a defensive mode, you're now playing catch up. And you're constantly playing whack a mole, and you're always behind. You're always behind, which means they're able to attack at will, and you're always playing catch up. And that is a terrible way to spend your time. So trying to get ahead of that curve means you're also using the same tools.
JR Williamson [:Now I feel like we're in an arms race, again, you know, with AI, because if the adversary is using these tools to effectively find ways to penetrate, then I need to use these tools to find those ways that the adversary can penetrate and close those things up before they get the opportunity, to exploit it. So just like we go back to those nineties, you know, with antivirus and, you know, we had the bad guys out here trying to develop ways to break in, and find novel ways to exploit, and we were out here trying to develop tools to discover their techniques very quickly. And so we're in this little race, between how quickly can I close a hole before they can find it? Now that has been exponentially, you know, sort of expressed in today's technology.
Mark Senell [:And there's no there's no guidance or governance or watch dog around this technology. You know?
JR Williamson [:We're getting there. I agree. I mean, it's funny. We've been working in the technology space on this evolution of advanced analytics and then eventually to learning, you know, and learning machines, and bringing more capable learning machines together over time, and different techniques and methods of doing this. So we knew this was coming. We've seen the movie. You know, Skynet rises and wipes out all the humans. So it's not like we're completely unaware. But being able to do this and do this effectively seems like it's just dropped on us, you know, here really quickly, you know, with OpenAI and ChatGPT, and there were models that were out there before.
JR Williamson [:But the really sort of, you know, human adoption of this tool and its just sort of prevalence, you know, out on the Internet, I think has really, really brought this forward. And this concept of large language models is also not new, but it's one that's really come forward now. And the fact that it isn't Rosie the Robot, you know, from the George Jetson days, doing that broken sort of computer speak, to now something that can readily replicate any of us, you know, at any moment and speak to us in a language that we can easily understand and translate, and then to maintain context of it. I'm not talking to the computer the way that I used to talk to the computer. Now it feels like I'm interacting with another human.
Carolyn Ford [:I mean, the usability. The usability for the masses is what it's all about. I mean, anybody can access those large language models, that generative AI, now. I mean, I can type in a question. It still isn't quite, I mean, I have my issues with it, but, anyway, well.
JR Williamson [:And it is still nascent. You know? So what this will look like 5 years from now, 10 years from now, 15 years from now will be very different, but it's still pretty damn phenomenal. When you think about the advancement we've had here in just the last couple of years. And imagine that speed of transformation, how that grows along Moore's law. And I know Moore's law is typically thought of in terms of hardware. But when we think about this advancement and understanding, because it builds on top of itself every time, it kinda blows your mind a little bit.
Carolyn Ford [:Yeah. So one of the things that surprised me at the Billington panel, everybody on the panel agreed that zero trust is a horrible term. And I actually had never heard people like
JR Williamson [:I hate that term.
Carolyn Ford [:Yeah. So I wanna know, you do hate it. So tell me why and, like, what's a better term?
JR Williamson [:Well, so I actually like the term earned trust or managed trust, better. Because, you know, all interactions require some sharing of information. And if I have zero trust, I'm not sharing anything. It's a zero.
Carolyn Ford [:Good point.
JR Williamson [:You know, something times zero is always, oh, yeah, zero. So the goal here is really not to have zero trust. The goal is to have earned trust and then to manage that trust. So what am I bringing to the table to earn your trust? So that at the end of that moment, we can have some agreement to share some information with each other. And sometimes that information may be really low in terms of value, so your trust factors may not be very high. I'm happy to share that information with you. I have no idea who you are, but I'm okay because this information is really not that important.
JR Williamson [:Versus something that may be much more sensitive, and I may require more things from you in order to build or to earn that trust so that I'm willing to share this information with you. So that's really the concept I think of, of earned trust or managed trust. Not a big fan of zero trust. I'll also say zero trust is not a new concept. You know, we've been doing this forever. But like cloud, we were doing application hosting, you know, not in our own data centers for years, and then we called it the cloud. Similarly here, zero trust has taken on sort of its own marketing, you know, kind of center of gravity.
JR Williamson [:But the concept is a good one, and that is that at the end of the day, you just don't provide ubiquitous access to things. You actually have to build access and earn the access based on who you are, where you are, what you're accessing from, and what information is appropriate and relevant for your mission and your role.
Carolyn Ford [:So would you say that zero trust then really does boil down to identity and access management?
JR Williamson [:It is about identity and access management. Absolutely. But I would just definitely say that it's not just identity and access management, because it's not enough that I know who you are and I have assurance in who you are. That's really important stuff. But what you're accessing this information from is also important, because safeguarding the data also has a parameter that we have to manage around where the data actually is. So you accessing the data from an unmanaged system or a system that does not have the right controls on it in order to appropriately safeguard that data is also important. So, yes, I know it's you. And you are authorized for this data, but you're trying to access it from a device that I cannot trust and is therefore not fit for use for actually storing, processing, or transiting that data. So I also need to detect that as part of this whole process, you know, of agreeing that it's okay for you to access that data. So where you do it, how you do it, when you do it, what you do it from, is equally important as who you are.
Carolyn Ford [:Yeah. And I mean, thinking back to earlier what you said, to achieve this zero trust architecture or earned trust architecture, let's change the world, JR.
JR Williamson [:Let's do it.
Carolyn Ford [:You have to start with, like, knowing what you've got, knowing what's important. So there's all of this, like, foundational work that has to happen first. And then this access piece is like, okay. Here's the key to the door. But everything else has already been built.
JR Williamson [:Right. And the access at the end should be the easiest part.
Carolyn Ford [:There you go.
JR Williamson [:Once you've done all this preparation, it should be the easy part. Like you mentioned earlier, Carolyn, around, you know, your time at Raytheon and dealing with data classification. You know, this idea, it's blocking and tackling, and it is not sexy, but it is essential to understand what you have. And if you don't understand what you have, you have no idea, what's valuable to the adversary, why would they want it, what's valuable to your customer, why do they want it. So this is really important stuff to focus your time on, to ensure that you build that in, and then that sort of conceptual governance now gets inculcated into your running processes.
Carolyn Ford [:That's fantastic. It makes me think about things a little differently. Like I said, when I hear data, I was immediately jumping to the insight part of it and the application of it, and there's so much in between. But my favorite thing today is the risktasity. Keeping that one. How about you, Mark?
Mark Senell [:No. That's that's fantastic. I mean, in your role, JR, you have to deal with managing and driving change with customers and within your own organization, it's gotta be a huge challenge. I, you know, can't even understand that. How do you how do you approach that with your customers?
JR Williamson [:Well, I think I think first and foremost is just understanding their mission. You know, what do you do? Why do you do it? You know, why is that important, to you? And then just trying to figure out what problems we're trying to solve. I mean, when when you think about your mission now put it in the context of playing tennis. So when I go to play tennis, I like to win. I'm an athlete. Athletes like to win. And so I'm out there and I'm playing against an opponent. And generally speaking, I know what my weapons are.
JR Williamson [:You know, my weapons are my ability to hit a forehand crosscourt or down the line with topspin, with power, keeping the ball deep and keeping the opponent back, because if they get closer to the net, it's gonna make it harder for me to be successful in my attack. So I know what my strengths are. My question is, what are their strengths? And what are their capabilities? And can I bring my strengths to play in order to beat the opponent? And as I do this from match to match to match, I start to realize, oh, crap. I'm not so good at that. I really need to improve my backhand, because the adversary has realized JR has got a good forehand. Don't hit the ball to his forehand. You know, and JR is good at hitting the ball strong from the baseline. Maybe if I slice the ball and chip it just over the net and force him to come to the net, I can take away his advantage.
JR Williamson [:So when I think about the mission outcome for the customer, what are the things that we're really good at? So we're on point, on mission. We're working within our upper and lower control limits. And then what are those areas where I'm not good at? So understanding where the problems are, how the adversary is looking at it, how the market is looking at it gives me the best chance of helping build a solution, not just thought leadership for the future. That's important to think about what's next. But then how can I actually turn the dials for them to get the mission outcomes that they need, when they need them?
Mark Senell [:Yeah. How are you defining where the problems are that you don't know about?
JR Williamson [:You have to understand your customer's mission. I mean, if you don't have intimacy and understanding your customer mission and what's important to them, you're already at a disadvantage. So how do you do that? First of all, you gotta build some trust with them, there's a reason why they're having a conversation with you. Otherwise, they're not gonna spend their time, you know, treasure and talent, on you. So building that trust that, hey, I can add value, you know, to this conversation. And then, building that through experience, building that through asking questions, building that through joint experimentation. Carolyn, you mentioned earlier innovation.
JR Williamson [:You know, this concept of co-innovation is really important. You know? So whether that's a C-RAD, a contract R&D thing that we do together, a CRADA that we put together, but we can work on trying to solve these problems. And then the last thing I'd say is past performance. You know, I have solved this problem before, and I've got, you know, examples to where I've solved this problem. And I can bring those stories to the customer to give them confidence that I understand the problem, I understand their mission, and working together, we can solve the issues that they have.
Carolyn Ford [:Yeah. You're really able to, like, zoom out, see the forest. You're not missing the forest for the trees. Zoom out, see what their mission is, and then say, okay. Here's how we're gonna, like, move forward. So okay. We're gonna move to our Tech Talk questions. So these are just quick fun questions, JR.
JR Williamson [:Oh, boy.
Mark Senell [:What about any new TV shows or movies or books or podcasts, anything like that?
JR Williamson [:All of those. All of those are great. Well, one, I'll just, talk a little bit about, The Expanse. Have you watched the show The Expanse?
Carolyn Ford [:So I started it, and I watched an episode, and it really scared me. And so I stopped watching it, but so tell me more. Should I start again?
JR Williamson [:So what I love, again, I'm a sci-fi buff, probably not a big surprise being an engineer, but, you know, I like space and science and technology. And one of the things that's really interesting about The Expanse, it's got like a little cult following here, is that it gets the science right. You know, who doesn't love Star Wars? I'm sure you guys have all seen all the Star Wars shows and the movies that are out there. I mean, there's just a whole ginormous empire around Star Wars, and I love them, and they're fun, and I always watch them. I've watched them all so many times, but the science is all wrong. You know? Luke and Vader, you know, avoiding themselves, you know, flying around, trying to get a lock on this spaceship. It's in zero gravity, for crying out loud.
Carolyn Ford [:JR, you're stepping on dangerous ground here.
Mark Senell [:Trek or Star Wars?
Carolyn Ford [:Star Trek. Well, also I mean, Star Wars. Both.
JR Williamson [:It's both. The same problem exists in Star Trek. Although Star Trek did a little bit better with it, but it's the same concept. You know, the whole idea that you're on this bend, you know, as you're flying through space. You know, it doesn't work that way. And once you get going in a vacuum, you don't stop. You know? So to turn around, you don't do this really big banking thing. You literally have to fire your thrusters to stop your momentum, flip turn, and go in a different direction.
Carolyn Ford [:But you're saying The Expanse gets it right.
JR Williamson [:Yes. The Expanse gets it right.
Carolyn Ford [:Are you looking forward to the new the second Dune?
JR Williamson [:I am. I'm a big, big fan of Dune. I read all the way up through the 3rd novel, and then quite frankly, it just got too weird, the the religion, you know, got too weird.
Carolyn Ford [:You got the third before it got too weird.
JR Williamson [:Yeah. Yeah. No. I thought, you know, the first and second were really great, and the third was good, but the old God Emperor thing got a little bizarre, and then religion just made a turn that I wasn't a fan of. But, you know, I love Dune and I've seen every Dune movie that's come out too, and I think this series is really good. I think they're being true to the book, I think the acting is really good. I think some of these esoteric things about how do you fold space, you know, and how the guild uses the spice, I think they're doing a good job making that more real.
Carolyn Ford [:We gotta let JR go. JR, we need to have drinks because we could geek out over, like, this stuff, the sci fi stuff. For hours, I could talk to you. But thank you.
JR Williamson [:I love it, and I'm always happy to talk about any of these topics. You know, IT and cyber are fantastic. They are certainly a passion of mine, and of all the things I've got to do in my career, I mean, I never wanted to be, really, an information security puke. I've never been a NO kind of guy. I'm more of a k-n-o-w kind of guy. I like solving problems. And so I didn't wanna be part of that, where all they did is tell people no.
JR Williamson [:You can't do that. But, you know, when I did finally get my arm twisted and got into this, I thought this was the greatest thing ever, and what a fantastic mission. And this gives us the opportunity to manage risk, back to risktasity, to solve problems in a safe way. And it's a fantastic mission. I have a great deal of passion for it.
Mark Senell [:Well, you've been great. This has been a lot of fun, JR. I really appreciate your time.
Carolyn Ford [:Yeah. Thank you so much. And thank you listeners. Make sure you share and smash that like button. We'll talk to you next week on Tech Transforms. Thanks for joining Tech Transforms sponsored by Dynatrace. For more Tech Transforms, follow us on LinkedIn, Twitter, and Instagram.