Episode 83

So What?: Understanding Disinformation and Election Integrity with Hillary Coover

Can you spot a deepfake? Will AI impact the election? What can we do individually to improve election security? Hillary Coover, one of the hosts of the It’s 5:05! Podcast, and Tracy Bannon join for another So What? episode of Tech Transforms to talk about all things election security. Listen in as the trio discusses cybersecurity stress tests, social engineering, combatting disinformation and much more.

Key Topics

  • 04:21 Preconceived notions make it harder to fake.
  • 06:25 AI exacerbates spread of misinformation in elections.
  • 11:01 Be cautious and verify information from sources.
  • 14:35 Receiving suspicious text messages on multiple phones.
  • 18:14 Simulation exercises help plan for potential scenarios.
  • 19:39 Various types of tests and simulations explained.
  • 23:21 Deliberate disinformation aims to falsify; consider motivation.
  • 27:44 India election, deepfakes, many parties, discerning reality.
  • 32:04 Seeking out info, voting in person important.
  • 34:18 Honest cybersecurity news from trusted source.
  • 38:33 Addressing bias in AI models, historic nuance overlooked.
  • 39:24 Consider understanding biased election information from generative AI.

Navigating the Disinformation Quagmire

Dissecting Misinformation and Disinformation

Hillary Coover brings attention to the pivotal distinction between misinformation and disinformation. Misinformation is the spread of false information without ill intent, often stemming from misunderstandings or mistakes. Disinformation, on the other hand, is a more insidious tactic: the intentional fabrication and propagation of false information aimed at deceiving the public. Hillary emphasizes that recognizing these differences is vital to effectively identifying and combating these issues. She also warns about the role of foreign entities that try to amplify societal divisions by manipulating online conversations to serve their own geopolitical aims.

Understanding Disinformation and Misinformation: "Disinformation is a deliberate attempt to falsify information, whereas misinformation is a little different." — Hillary Coover

The Challenges of Policing Social Media Content

The episode dives into the complexities of managing content on social media platforms, where Tracy Bannon and Hillary discuss the delicate balance required to combat harmful content without infringing on freedom of speech or accidentally suppressing valuable discourse. As part of this discussion, they mention their intention to revisit the book "The Ministry for the Future," which explores related themes, suggesting that the novel offers insights that could prove valuable in understanding the intricate challenges of regulating social media. There is a shared concern that an overly robust censorship approach could hinder the dissemination of truth as much as it limits the spread of falsehoods.

The Erosion of Face-to-Face Political Dialogue

The conversation transitions to the broader societal implications of digital dependency, specifically how diminished community engagement has led individuals to source news and discourse increasingly from digital platforms. This shift toward isolationist tendencies, amplified by digital echo chambers, results in a decline of in-person political discussion. As a result, there is growing apprehension about the future of political discourse and community bonds, with Hillary and Tracy reflecting on how rare the open, face-to-face political conversations that past generations traditionally engaged in have become.

The Shadow of Foreign Influence and Election Integrity

Challenges in India’s Multiparty Electoral System

In the course of the discussion, the complexity of India's electoral system, with its multitude of political parties, is presented as an example that underlines the difficulty in verifying information. The expansive and diversified political landscape poses a formidable challenge in maintaining the sanctity of the electoral process. The capability of AI to produce deepfakes further amplifies the risks associated with distinguishing genuine content from fabricated misinformation. The podcast conversation indicates that voters, particularly in less urbanized areas with lower digital literacy levels, are especially vulnerable to deceptive content. This magnifies the potential for foreign entities to successfully disseminate propaganda and influence election outcomes.

Election Integrity and AI: "Misinformation and disinformation, they're not new. The spread of that is certainly not new in the context of elections. But the AI technology is exacerbating the problem, and we as a society are not keeping up with our adversaries. Social media manipulation, phishing and social engineering attacks enhanced by AI technologies are really stressing the system and stressing the election integrity." — Hillary Coover

Countering Foreign Disinformation Campaigns in the Digital Age

With a focus on the discreet yet potent role of foreign intervention in shaping narratives, Hillary spotlights an insidious aspect of contemporary political warfare: the exploitation of media and digital platforms to sway public perception. This influence is not limited to overt propaganda but extends to subtler forms of manipulation that seed doubt and discord among the electorate. As the podcast discussion suggests, the consequences of such foreign-backed campaigns could be significant, leading to polarization and undermining the foundational principles of democratic debate and decision-making. The weight these campaigns can carry in political discourse warrants vigilance and proactive measures to defend against such incursions into informational autonomy.

Addressing the Impact of Disinformation Through AI's Historical Representation Bias

Tackling Disinformation: AI Bias and the Misrepresentation of Historical Figures

The discussion on AI bias steers toward concrete instances where AI struggles, as Tracy brings forth examples that illustrate the inaccuracies that can arise when AI models generate historical figures. Tracy references a recent incident in which Google's Gemini model was taken offline after it generated images of German soldiers from World War II that did not match historical records. Similar errors occurred when the AI produced images of America's Founding Fathers featuring individuals of racial backgrounds that did not reflect the actual historical figures. These errors are attributed not to malicious intent by data scientists but to the corpus of data used to train the models. This segment underscores the significant issues that can result when AI systems misinterpret or fail to account for historical context.

The Necessity of Addressing AI Bias

Continuing the conversation, Hillary emphasizes the importance of recognizing and addressing the biases in AI, advocating for an understanding of historical nuance to circumvent such missteps. Both Hillary and Tracy discuss how biased news and misinformation can influence public opinion and election outcomes, which brings to light the critical role historical accuracy plays in the dissemination of information. They point out that preventing biased AI-generated data from misleading the public requires a combination of historical education and conscious efforts to identify and address these biases. The recognition of potential AI bias leads to a deeper discussion about ensuring information accuracy, particularly with regard to historical facts that could sway voter perception during elections. Tracy and Hillary suggest that addressing these challenges is not just a technological issue but also an educational one, where society must be taught to critically evaluate AI-generated content.

The Challenge of Community Scale Versus Online Influence

Combating Disinformation: The Struggle to Scale Community Engagement Versus Digital Platforms' Reach

The dialogue acknowledges the difficulty of scaling community engagement in the shadow of digital platforms' expansive reach. Hillary and Tracy delve into the traditional benefits of personal interactions within local communities, which often contribute to more nuanced and direct exchange of ideas. They compare this to the convenience and immediacy of online platforms, which, while enabling widespread dissemination of information, often lack the personal connection and accountability that face-to-face interactions foster. The challenge underscored is how to preserve the essence of community in an age where online presence has become overpowering and sometimes distancing.

Navigating the Truth in the Digital Age: “Don't get your news from social media. And then another way, like, I just do a gut check for myself. [...] I need to go validate." — Hillary Coover

Impact of Misinformation and Deepfakes on Political Discourse

The episode reiterates the disquieting ease with which political discourse can be manipulated through deepfakes and misinformation. Showcasing the capabilities of AI, Tracy recalls a deepfake scam involving fake professional meetings that led to financial fraud. These examples underscore the potential for significant damage when such technology is applied maliciously. Hillary emphasizes the critical need to approach online information with a keen eye, pondering the origins and credibility of what is presented. Both Tracy and Hillary stress the importance of developing a defensive posture toward unsolicited information, as the blurring line between authentic and engineered content could have severe repercussions for individual decisions and broader societal issues.

Stress Testing and Mitigating Disinformation in Election Security Strategies

The Role of Stress Tests in Election Security

Hillary and Tracy discuss the importance of conducting stress tests to preemptively identify and mitigate vulnerabilities within election systems. These tests, which include red teaming exercises and white hat hacking, are designed to replicate real-world attacks and assess the systems' responses under duress. By simulating different attack vectors, election officials can understand how their infrastructure holds up against various cybersecurity threats, and that information can be used to make improvements that enhance security. The goal of these stress tests is to identify weaknesses before they can be exploited by malicious actors, thereby ensuring the integrity of the electoral process.

Mitigating the Impact of Disinformation

The conversation emphasizes the urgent need for preemptive measures against disinformation, which has grown more sophisticated with the advent of AI and deepfakes. As these technological advancements make discerning the truth increasingly difficult, it becomes even more crucial for election officials to prepare for the inevitable attempts at spreading falsehoods. Through stress tests that incorporate potential disinformation campaigns, officials can evaluate their preparedness and response strategies, including public communication plans to counteract misinformation. By considering the psychological and social aspects of election interference, they aim to bolster defenses and ensure voters receive accurate information.

Election Security Concerns: "Other instances are going to happen where criminals are gonna be impersonating legitimate sources to try to suppress voters in that case, or steal credentials, spread malware." — Hillary Coover

Importance of Proactive Approaches to Election Safeguarding

The exchange between Tracy and Hillary reveals a clear consensus on the necessity of proactive strategies for protecting elections. Proactively identifying potential threats and securing electoral systems against known and hypothetical cyber attacks are central to defending democratic processes. By focusing on anticipation and mitigation, rather than simply responding to incidents after the fact, authorities can improve election security and reinforce public trust. This proactive stance is also crucial in dealing with the spread of disinformation, which may be specifically tailored to exploit localized vulnerabilities in the electoral infrastructure.

Reflecting on the Challenges of Election Security in the Digital Era

This episode serves as a thorough examination of the challenges posed by digital communication in modern democracies. Carolyn, Tracy and Hillary delve into the dangers of misinformation and the manipulation of public opinion, highlighting how biases in AI can affect the information individuals receive. They underscore the importance of stress-testing election systems against digital threats and recognize the complexities inherent in securing contemporary elections. The episode ultimately helps listeners better grasp the ever-evolving landscape of election security and the continued need for informed, strategic action to safeguard democratic processes.

About Our Guest

Hillary Coover is one of the hosts of It’s 5:05! Podcast, covering news from Washington, D.C. Hillary is a national security technology expert and accomplished sales leader currently leading product strategy at G2 Ops, Inc.

Transcript
Carolyn Ford [:

Hi. Thanks for joining us on Tech Transforms. I'm Carolyn Ford here with Trac Bannon. Hi, Trac.

Tracy Bannon [:

Hello. Hello.

Carolyn Ford [:

I'm so excited to talk to you today about this. We've already talked a little bit, and we're gonna dig in in just a second, on the latest So What? episode. Today, we are welcoming Hillary Coover, who is a technologist and one of your fellow reporters contributing to It's 5:05! podcast, which is another plug here for 5:05, guys, it's a great, like, McDonald's sized reporting of what's going on in cybersecurity. I usually listen to the Fridays, but it's every day. Right?

Tracy Bannon [:

We've pulled back temporarily to Fridays. And so if folks wanna catch us on Friday, head to your happy hours while listening to our points of view. It is 4 to 6 of us all giving a very different take on the same topic.

Carolyn Ford [:

Yeah. So it's only like 10 to 15 minutes tops. And so then you can go dig into topics that you are really crazy interested in, which is what we're going to do today with Hillary, because Hillary's been reporting on election fraud. And we're gonna focus on the latest news surrounding election security and disinformation, which we've already been talking about. Well, let me just back up and say hi, Hillary. Welcome to Tech Transforms So What?

Hillary Coover [:

Hi. Thanks for having me.

Carolyn Ford [:

Yeah. Like, clearly, I'm excited to talk about this. And Trac already got me going. I mean, you saw how she can just get me worked up, because I saw over the weekend this really beautiful video, Billy Joel's Turn the Lights Back On video. If you guys haven't seen it, I'm going to put the link in the show notes. It is gorgeous, and it takes you through the decades of Billy Joel's career, him singing this song, Turn the Lights Back On, and it's a video of him through the decades, or so I thought. Trac, go ahead and tell them what you did to me.

Tracy Bannon [:

Well, I explained to you that there's too much consistency. There is no splicing together, and this is a brand new song. So it's all AI. It is. And it's beautiful, and it's well done, and it's impressive.

Carolyn Ford [:

And we stopped talking to have Hillary go watch it for a second. Hillary, like, if you hadn't known right from the get-go that it was AI, what would you have thought you were watching?

Hillary Coover [:

I probably would have thought the same thing. I mean, because I went into it thinking, okay. I need to find the inconsistencies. I was looking for, you know, buggy eyes, funky eyebrows, lip syncing inconsistencies, you know, that sort of thing. But I knew going into it. So, it was hard to spot even knowing going into it.

Carolyn Ford [:

Yeah. And so it really drives home the point of what AI can do with this election cycle. I mean, it was already happening. You were reporting on it in 2016, Hillary. But I mean, for me, like, why would I not trust that video or believe that video was a spliced-together video of Billy Joel? I'm a Billy Joel fan. I didn't recognize the song, but I was like, it's just an obscure song that maybe I haven't heard before. And there is no way I would have ever considered that video AI until, Trac, what you said.

Tracy Bannon [:

Well, think about why it got you. It's because you believed that there was a connection or consistency because you've been a fan, because you've seen him so many times, may have gone to concerts. So your brain already has a whole set of preconceived notions, right, about this. That's what makes it harder to fake these things. But I just had a video played for me, and the person I was talking to live, who's really amazing with all of these generative technologies, immediately switched over and showed the recording: showed him speaking German, then showed him speaking Japanese, then showed him speaking in Obama's voice, and it actually re-fixed his lips. So I was looking for the telltale, you know, like when we were really little kids and you'd watch Godzilla and, like, the mouth would move. Yes. Right? You see that craziness? That's not happening anymore. So it's going to get even harder when you don't have a relationship with somebody.

Tracy Bannon [:

Whether it's a fan relationship or a real human relationship, it's gonna get even harder. Like, have you seen some of the influencers on TikTok and Instagram that are generated? There are some that are getting some pretty big footprints now.

Carolyn Ford [:

I saw the one, like, the Tom Cruise one, but it was a couple years ago. It was very good. But as soon as I started watching, I was like, oh, that's a deepfake, it's AI generated. That Billy Joel one though, man, you're right. Like, my brain wanted to believe. My brain knew what I was looking at, and then you blew my mind again. So, Hillary, I wanna bring this around to, well, bring this into the presidential election and ask both of you, starting with you, Hillary, what are some of the top concerns that you're hearing discussed in terms of election security this year?

Hillary Coover [:

I mean, for the most part, there are a lot of concerns, as there are every election. Misinformation and disinformation, they're not new. The spread of that is certainly not new in the context of elections, but the AI technology is exacerbating the problem, and we as a society are not keeping up with our adversaries. Social media manipulation, phishing and social engineering attacks enhanced by AI technologies are really stressing the system and stressing the election integrity. So it's a tough situation. I think CISA is doing a good job, as are other influencers in the space, of getting campaigns out to inform the public on what's going on and what to look for and to be vigilant. Do I think they're doing it as well as they could be? Maybe not. But there's room for improvement for sure.

Tracy Bannon [:

So let me ask you a question about that.

Hillary Coover [:

Yeah.

Tracy Bannon [:

Because, so, I go for my morning walk with my husband. We walk a couple of miles, and we debate. Like, so you can imagine I start my morning just, like, ready to go at it. And we were talking about this. We're talking about, how do you police this? How do you analyze for it? How do you help? Because we saw the last election cycle. We could argue about what was misinformation and what wasn't. We could argue back and forth about lockdowns and what was real information and what was not, because some of it was just decided by platforms. How do we avoid... right? How do we actually solve this? Because saying everybody be aware...

Tracy Bannon [:

Well, I can tell you that Carolyn is very aware of cybersecurity. I've been scaring the bejesus out of her for a year or two years, so she's aware. And yet, what is she gonna do? What would she do? What's the average person to do when the fake is getting so real? What are the other tells that you can look for, other than trying to figure out if the hair is parted the wrong way?

Hillary Coover [:

I mean, don't get your news from social media. I'm sorry.

Carolyn Ford [:

What about what about the fake calls from Joe Biden?

Hillary Coover [:

Yeah. That was scary. And that's, you know... I mean, other instances are going to happen where criminals are going to be impersonating legitimate sources to try to suppress voters, in that case, or steal credentials, spread malware. I think considering the popularity of that news story, how far it got out and how fast it got out, I mean, the damage was definitely done, but I think valuable lessons were learned for the, you know, the presidential election, because they said, okay, if this stuff is gonna happen, here is what we need. We need to expect it to happen at this point. And, you know, let's come up with some ways of addressing this even faster than we did the actual incident. Is the execution of it gonna be perfect? Maybe not.

Carolyn Ford [:

Well, and, you know, I thought if I were one of those people that received that call, oh, I sure would have flagged it as fake, because it just doesn't make any sense for Joe Biden to call me and tell me not to vote. Now, as those words come out of my mouth, I don't know how good the social engineering was. I didn't hear or read the actual message that was supposedly sent. And so I don't know. Like, maybe if I had taken that call, it would have been convincing enough that I'm like, okay, there's some logic here. Okay. Yeah. I see.

Carolyn Ford [:

I don't know. Did either of you actually hear the call?

Tracy Bannon [:

I did not.

Hillary Coover [:

I did not. But I also don't answer unknown numbers. So I probably wouldn't have been.

Tracy Bannon [:

Right. If you're not in my contact list... Same. But even on my business phone, my separate phone, it's dicey. I have to know area codes. And even then, when I get them wrong, I'm pretty angry with people.

Hillary Coover [:

Yeah. Same. So... but I think...

Tracy Bannon [:

Go to voice mail, and I'll call you back.

Carolyn Ford [:

Yeah. But I think, like, one of the ways, Trac, to your point, and I love what you just said, Hillary: don't get your news from social media. And then another way, like, I just do a gut check for myself. Like I said, if I had gotten that call somehow, I would have been like, this makes no sense. I need to go validate. And we're all so busy that this garbage comes at us and we get in this reactionary mode where this stuff can get us. And it's like, how do we know what to trust? I mean, how do we even know? Trac blew my mind again before we started recording when I said, at least I know that I'm talking to the real Hillary right now, and I'm talking to the real Trac right now over Zoom. Like, I have... Where are you? Exactly.

Carolyn Ford [:

So I didn't know this was possible. Trac, let's go ahead and talk about what you did to me the second time this morning.

Tracy Bannon [:

People are gonna listen to this episode, and they're gonna think that I'm, like, this malicious kid pulling the wings off butterflies or something. No. I shared with you that there had recently been a scam that involved deepfakes. And, you know, we can post the article so folks can see that in the comments. But, essentially, they had multiple professionals that showed up for this meeting, and only one of them was human. But they were named correctly. They were carrying out their roles and responsibilities.

Tracy Bannon [:

And what they were able to do was to have millions of dollars moved to different accounts that they can't get back anymore. So, you know, you're in there with your CTO, your CFO. You're there talking strategy, and you're the junior person, but it all seems reasonable. And it's not just one meeting. It's a series of meetings. And through that series of meetings, you eventually take the action that you've been asked to do, and shazam, bang. Money gone.

Carolyn Ford [:

I still can't believe that. Like, I gotta go read this story in depth, because 200,000,000... talking to people you think you're talking to over Zoom, and you... I can't believe it.

Tracy Bannon [:

So, it looks like it was 200,000,000 in Hong Kong dollars. It was 26,000,000 US.

Carolyn Ford [:

Oh, only 26. That's too old.

Tracy Bannon [:

Only 26,000,000. That's too old. Cool. That's crazy. So that's real extreme. That's high stakes. That's high-stakes misinformation, disinformation, high-stakes scamming. Think about this getting, you know, down into small-potato things where you can't trust anything that you're doing anymore.

Tracy Bannon [:

I get text messages occasionally on my personal phone for something that's being shipped to me. And it'll say, here's the FedEx, and it's on its way. Now if it gives me a link and says, would you like to track this? I'm like, not gonna click on that link. I recently, just within the last 3 days, had a text message that said, this is USPS. We don't have your complete address. Could you log in and give us the rest of your address? I'm like, really? Seriously? And all I had to do was look at the credentials of what was getting sent to me. Like, the number it was sent from was, like, at Gmail. It's like benjisomebody@gmail.com.

Carolyn Ford [:

And after, they're trying to get you to put in a username and password, maybe even more personal information. That's what they're after. I've never clicked on one of those either. I get them all the time. The regular...

Tracy Bannon [:

But you're starting to get text messages, random text messages. Yeah. I got one that was masquerading as though it was executive leadership in the company that I work for, and it was believable. What he was asking me to do was believable up until that third question, because I'm always like, why would he text me? And then it dawned on me: I have 2 phones. Both of my phones use the same iCloud account, the same Apple account. So I get text messages on both phones. The key for me figuring it out was it was only popping up on my personal phone, and that person would have never had my personal phone number. But think about it. They're sending out robo emails, robocalls, robo text messages.

Tracy Bannon [:

Have you gotten robo text messages about the elections? Mhmm.

Carolyn Ford [:

I got them from... Nikki Haley has been texting me.

Tracy Bannon [:

Really? Well, that's interesting. I would expect her to text you.

Hillary Coover [:

Yeah. I was just gonna say, in the context of elections, I get them all the time, to the point where, you know, they try to get you riled up about an issue that they anticipate based on your demographic, your voting registration status, and all that. And they try to get you riled up so you wanna click on something, to be like, you know, yes, I will sign a petition for something. But, yeah, like, you open it up and you look, and you're like, it's benjie@gmail, some other, you know, scammy-looking hyperlink or short link. So I never click on them. And every time I get one, I screenshot it, and I send it to my mom.

Hillary Coover [:

And I say, mom, don't click on that.

Carolyn Ford [:

Well, do you know what gets me riled up? I don't care who it is. Well, maybe Obama. He can text me. Probably Jane Goodall. But anybody else, I'm like, did I give you my number and tell you you could text me? I don't care who you are. Don't be sending me texts. That's what gets me riled up.

Hillary Coover [:

Feel the same way. Even legitimate ones. I think they're all suspect because of that.

Tracy Bannon [:

There was no consent. It's stepping over a line when it comes to text messages, because text messages are the most personal. And if you're going to use it for any kind of business context, I want them to ask me first.

Hillary Coover [:

Mhmm.

Tracy Bannon [:

I'm okay that my doctor reminds me of my appointment by text, but I've agreed to it ahead of time. Other than that, if I don't know you, that's a personal machine that's strapped to my body at all times. You do not be texting me.

Hillary Coover [:

No. No. No.

Carolyn Ford [:

Well, so, Hillary, you did a spot on 5:05 just a few months ago about cybersecurity stress tests. And I'm very interested to hear how that can improve election security.

Hillary Coover [:

I mean, stress tests attempt to simulate real-world attacks and scenarios to identify vulnerabilities proactively, with the goal of staying ahead of attackers. And so there is value in it. Is it perfect? No. Of course not. But as we're able to monitor elections and election tampering worldwide, which Trac reported on, I think we are getting an increased amount of data to support these stress tests and help ensure the integrity of our own elections coming up. Mhmm. But yeah.

Carolyn Ford [:

So for the layman, I know what it is, but I'm just testing you. Explain what a stress test is.

Hillary Coover [:

So it's a simulation. So, you know, an example of it would be... the tabletop exercises are one format of them, and you can build out these scenarios based on actual events, for example, the Biden robocall. And you can say, okay, this happened and reached 15,000 citizens. What do we do? What are the ramifications of everything? What happens? What's the fallout? And you enact this whole scenario based on either something that's actually happened or something that could happen, and it's all with the goal of trying to think ahead of your adversary and be proactive about it. There are far more offensive and technical versions of stress tests as well. Trac would be far more qualified to speak to those. So I'll pass that.

Carolyn Ford [:

So you're actually testing the systems, like you're red teaming the systems? Mhmm. Okay. Mhmm.

Hillary Coover [:

Yeah. The red teaming the system part, that would be more up Trac's alley. But, like, the actual simulations and creatively coming up with, you know, hey, based on what happened in India, let's design a tabletop exercise around that, or what happened over here. Like, let's take in some information and try to be creative here and be as innovative, or more innovative, hopefully, than our adversaries.

Tracy Bannon [:

Well, and, you know, to that point, there are a couple of different types of tests and simulations that we have to run. Like, so what she was talking about is really getting after the simulation and understanding, emulating how that information gets out there. What's the likelihood that different people react in different ways to it? When we talk about stress testing of systems, for example, the voting systems, they are enlisting cybersecurity experts now to help them stress test these machines. Meaning they put them under duress, they put them under stress. Say the connection is not quite what it needs to be. What happens if things are going through really rapidly, or if somebody tries to come in through different attack vectors, as we call them? So they're having security pros come out, because originally, not all systems are built secure by design. You'll hear me talk about this all the time.

Tracy Bannon [:

Hillary probably is. Carolyn? No, you're not. I don't see eye-rolling. But yes, I talk about secure by design all the time, because oftentimes, as systems have grown over the years, we bolted on security as an afterthought, you know, just kind of protecting things after the fact. They are now running these red team games. They are doing their teaming to get after this and make sure that, at least if you go to where those machines are and you vote with that machine in that town hall or in that school or in that church or synagogue, what have you, that piece of it will be accurate.

Carolyn Ford [:

Now, so there's 2 different kinds of stress testing, I think. You guys make sure I'm following. So, Trac, when we stress test the systems, we're red teaming, we're white hat hacking. Yes. Okay. Hillary, like, we might do these tabletop exercises, stress tests, like disinformation, the whole robo Biden call. You would come up with these different scenarios and then just talk them through. They're not necessarily IT systems or security, you know, cyber, but they could be.

Hillary Coover [:

I think it's all important

Carolyn Ford [:

because they're not separate. There's the social engineering part that flows into the systems for the stress testing. Yeah. Am I on track here?

Tracy Bannon [:

No. No. We're tying it all together. So, okay. What Hillary is talking about is forecasting where the threats are and simulating where the threats could be.

Tracy Bannon [:

There comes a point... not everything flows into a machine that's over at my town hall. Right? So if it doesn't, what does it look like if the information is coming? You know, are votes going to be swayed? That would be something that isn't necessarily going to happen at the machine itself, but the votes could be swayed because of public opinion. Right? Through all of those people getting their news. Like Hillary said earlier, don't get your news on social media. Although I just popped a number out here that said that 86% of US adults get some of their news from their smartphone, computer or tablet. Well, but that doesn't mean social media.

Hillary Coover [:

That could be an app.

Tracy Bannon [:

Where do you start and stop? Hillary, let's go back to this discussion. Because I can put comments at the bottom of every one of those CNN articles or BBC articles. I can put... it's become social. Yeah. That's true. Intersectionality, peeps.

Carolyn Ford [:

So, Hillary, I wanna go back really fast to what you talked about with disinformation. You basically said there's not anything we could do about it other than know that it's gonna happen and be as ready as we can?

Hillary Coover [:

Yeah. And I think, honestly, there's also the nuance between mis- and disinformation. Disinformation is a deliberate attempt to falsify information, whereas misinformation is a little different. But thinking about the motivation of whatever you're reading, like, who benefits from some sensational story about, you know, a world leader, a politician or something like that. I'm thinking about, like, could this be, you know, an Iran-backed campaign or a Russia-backed campaign, or a China-backed one? Right? And so instead of looking at some of these things as, you know, the sowing of discord and polarization between, you know, Republicans and Democrats, just so much of it is foreign influenced. And with the actual intent of, you know, sowing discord. Right? So,

Carolyn Ford [:

that's a really good thing to think about. Who benefits from this? And I go back to my mother not wanting to get the COVID vaccination because they were gonna nanochip her. And I'm like, really? Who benefits from that, mom? Nanochip you to what end? Plus you already have a pacemaker, so you're nanochipped.

Tracy Bannon [:

We enter a very slippery slope of the continuum of truth nowadays. So there is no black and white truth anymore. There are shades of gray. Because my opinions would be based on looking at the science that I look at, and what you look at may not agree with what a third person looks at. So there's a selectivity. Twitter, X, not foreign. TikTok, not foreign. LinkedIn, not foreign. So it's echo chambers that we get access to.

Tracy Bannon [:

They are. Exactly. So

Hillary Coover [:

foreign campaign influence. Oh,

Tracy Bannon [:

true. True. Very. But these are our

Hillary Coover [:

going. Yeah.

Tracy Bannon [:

Yeah. So I struggle with what we need to do with those social media platforms to combat it, because there has been so much suppression of good information along with the suppression of bad information. That's what I'm concerned about, and I mentioned this in my report on Friday. I'm concerned that in our effort to police, we're gonna be suppressing as much good as bad.

Hillary Coover [:

Yeah. Just this morning, there was an article about a couple of states that were looking to treat social media companies, and the content published on those platforms, at the state level just like they would a utilities company, and so moderating, kind of taking ownership of content. And I don't know how that would work. I don't know if that will ever pass or come to fruition.

Carolyn Ford [:

But you guys, can we start a book club, and will you guys read this book? It's called Ministry of the Future.

Tracy Bannon [:

Ministry of the Future. It sounds interesting. I'm writing it down.

Carolyn Ford [:

We need to circle back and talk about this book, because it addresses the social media conundrum in a very interesting way. I would love to know what you guys think about this. And, actually, Obama said it should be required reading for everybody. It's kinda focused on the environmental problem, but as we all know, it's all connected. So, okay. I sidetracked this for a second, but I'm serious. Like, we're gonna loop back to this and have a little book club episode. Tracy, I wanted you to talk about... you've done a lot of reporting on election fraud, like, worldwide.

Carolyn Ford [:

Which I would think Hillary's stress tests, like, feed off of. Like, they're like, oh, this happened here. Let's see what would happen if we injected that same scenario in the US election system. So what are we seeing worldwide with election fraud?

Tracy Bannon [:

Well, we just need to keep our eyes on what's going on in India for their elections right now. So deepfakes are out there, and folks are not necessarily sophisticated. In cities, right, it's the same thing that we have here, where you have hubs, where you have good connectivity, you tend to be better able to discern it. But folks are not able to discern what's real and what's not real. And so especially when you're talking about not a 2-party system. There are many, many, many parties. When I was in India for their election in 2011, it was just amazing to see how many candidates there were. You know, 15, 17 candidates, just like this amazing bifurcation. But as we've been talking about this today, it keeps coming back to me, and you guys are gonna call me, like, you know, like, I don't know...

Tracy Bannon [:

Call me crazy. I think we're getting back to the point where we need to see it with our own eyes, where we need to talk to people.

Carolyn Ford [:

Tell them, Trac. That's what I say. Like, we gotta go back to our grandparents and talk face to face.

Tracy Bannon [:

How did it scale before? Well, they went to church. They went to dinner. They went to communities. They had meetings. Right? Those things didn't have to go away. They didn't have to go away. As a society, we kinda chose it, because we started to get very insular, right, in our little circles. What do we call it? Cul-de-sacs. The cul-de-sac-ing of America.

Tracy Bannon [:

Four houses, we know the names of those. We don't walk the streets anymore. We don't know folks. I'm wondering, if we start to tie this in at a mega level, the breakdown of community has driven us to be online, more and more online. So we're engaging less. When was the last time you walked out of your house and talked to your neighbor about politics?

Carolyn Ford [:

I mean, I try not to do that even with my friends because it can be really... Yeah.

Hillary Coover [:

I was gonna say, it's kind of a taboo topic now.

Tracy Bannon [:

But it shouldn't be. It should be that I can say to Carolyn, anybody but Biden, or anybody but Trump. And I should be able to say that, and then we can discuss. Well, why? Well, you know, I have this opinion about these events. These are my thoughts about border protection. This is what I've observed. This is what I've seen. And we should be able to have that dialogue, and you shouldn't hate me and hang up on me. And yet, people are now feeling as though they have to seek out, it's almost like you gotta have a secret handshake.

Tracy Bannon [:

So I know whether or not it's safe to tell Carolyn what I actually think. What has happened that has driven us into such a hidey-hole, hiding this way? It's crazy. So I do believe we're gonna have to... I don't know how it's going to scale. I don't know if it can scale, but I can tell you what won't work: what we're doing right now. I don't have any way that I can... I have nothing that I can click on that will say this is valid. I don't have any engines that I can push things into easily as an average American.

Tracy Bannon [:

That'll tell me, oh, yeah. Because I'm definitely not gonna go to the Snopes website, and I'm not gonna go out here to the fact checkers who label everything as fact or fiction, because we've seen over the last 3 years that those labels are not always equally applied. So I don't know. Where do we turn? Hillary, you're young. What's the answer?

Hillary Coover [:

I don't have the answer. I mean, you're right in that what we're doing right now is not going to work, but the community and in-person engagement is also not going to scale. I don't see anybody moving too far from the status quo anytime soon, because of the reachability of these platforms. And, you know, even with the risk of deepfakes and misinformation and disinformation, you still have a platform that can reach exponentially more viewers, readers. I mean, until the trust in that is truly broken among the public, which I don't think it is, and I think it's gonna take a lot to totally break, I see us staying where we are. Right.

Tracy Bannon [:

Yeah. I agree that it's not scalable, but I'm still going to seek it out, and I think all of us have the imperative to start seeking it out. It is darn convenient to just flip my phone open when I wake up at 2 in the morning and scroll through the news and figure out, you know, what I'm going to do there. I do think another piece that we can take into consideration is, as the elections come, everybody has a polling location. It will help us all if everybody who is able makes a really, really hard attempt to vote in person as opposed to through the paper ballots, because of the paper ballot system and the way there have been duplicates printed. You can look up some interesting facts about Pennsylvania and what they did when they found out they had duplicates. Oopsie.

Tracy Bannon [:

We're going to see some misinformation, disinformation, voting fraud still there. So some of the simple old-school things: if you can vote in person, make every attempt to vote in person. Try to not mail in a ballot. Remove that one extra layer of obfuscation from it.

Hillary Coover [:

And help your neighbors vote in person. Absolutely.

Tracy Bannon [:

Absolutely.

Carolyn Ford [:

Okay. Because I'm just thinking this through, and I'm not gonna lie. I'm not gonna go vote in person. I'm not gonna go stand in line. Really? What if, Trac, here's a compromise for me. What if I get my mail-in ballot and, I don't just put it in the mail, I take it to the voting station and I drop it there?

Tracy Bannon [:

Well, have you saved yourself any steps, other than standing in line to vote?

Hillary Coover [:

Well, yes. Because it's the standing in line to vote.

Hillary Coover [:

Do you have early voting, early in person voting? We do.

Tracy Bannon [:

Oh, so... yeah. So it could be distributed a little bit more. So you might be able to stand in line for 5 minutes instead of 5 hours.

Hillary Coover [:

Yeah.

Tracy Bannon [:

I don't know.

Hillary Coover [:

I like getting the stickers too. You know?

Tracy Bannon [:

Oh, man. So do I.

Carolyn Ford [:

I get one in my mail-in ballot.

Tracy Bannon [:

Because it's... oh, jeepers. That's like, you might get it out of a cereal box. That's right. That's right.

Carolyn Ford [:

So for the people like me, and I'm being so honest right now, like, mhmm, I'm gonna get my news from It's 5:05. I'm gonna get my cybersecurity news from 5:05. I mean, that's a huge step to say everything's gotta go back to in person. Like, yeah, that would be great. And like you said, Hillary, it's not gonna happen anytime soon. Like, there's gonna need to be something drastic for that to happen. So what are some other things, like, that we can do?

Tracy Bannon [:

Does it get to digital rights? Do we start to go back to digital rights management? Right? This was a huge deal 15 or 20 years ago, to make sure we understood where that recording came from. If you had... and we just call it, you know, just rip your CDs, right? Not that I ever did any of that ever. Zero. But somebody ripped their CD and they might share a song.

Carolyn Ford [:

I did that all the time.

Tracy Bannon [:

Oh, mixtapes. You're sharing your age there a little bit. But yeah. So I don't know. Perhaps it is we need to find the science. My husband and I were talking about this a while. We're like, does blockchain factor into this? Does digital rights management factor into this? Like, is there a way that I can have, just like we have certificates for authentication, right, a certificate that says this email actually came from Tracy? Are there, you know... what other things can we do?

Carolyn Ford [:

I remember the digital wallet thing. That was a big deal 20 years ago. In fact, I worked for a company where we worked on it. Or what about, like, the generative AI, like, forcing sources? When AI spits something out, it's gotta give you the sources for where this stuff's coming from.

Tracy Bannon [:

That one is particularly hard, but you'll notice.

Hillary Coover [:

Data provenance is always hard.

Tracy Bannon [:

Well, but it's even harder when you're talking about the mathematical statistics. Right? It's the probability that 2 words are found together. And now we're talking about, not... I mean, it's not sequential. We're talking about, and the reason that they say that they have all these billions and billions of parameters, how many connections between any one phoneme and any one other phoneme. So it gets really difficult to track it back to how many possible sources were there. Now if you use Perplexity, perplexity.ai, it provides you all the sources. It tells you. And if you get something that doesn't have the source, you're like, oh, I'm going to regenerate that question.

Tracy Bannon [:

I'm going to ask it for sources on this. And then you start to go and check the sources. And some are dated, and some are no longer valid, but at least you know. At least you know.

Carolyn Ford [:

Yeah. So okay. Sorry. Go ahead, Hillary.

Hillary Coover [:

Oh, no. When Bard first came out, they had sources, and that was, like, kind of the difference between Bard and ChatGPT. And I was so excited about it, and I went to do a demo for somebody to, you know, get them excited about being able to see the sources, and then they took it away. And I was like... I don't know if they've since re-added it with the rebrand and all that, but, yeah, I was pretty excited when that came in. But, yes, Perplexity is a great tool.

Tracy Bannon [:

Well, so that'll take us down another rabbit hole. I'll just take a quick side hustle down this rabbit hole. Uh-oh. So, in attempting to fix, right, some of this election fraud, which could be deepfakes and misinformation, well, we could pivot and talk about bias. And so much of the bias that people are talking about in models, it isn't that some data scientist somewhere nefariously built in bias. That's not what it is. It's the corpus of data that's being used to train.

Tracy Bannon [:

So if I train a model in the US and the majority of the people writing the newspapers are, and this is not true, but I'm gonna say it, they're cis-gendered white guys from New England. Well, then the news is gonna be biased towards their perspective. And from their perspective. Yep.

Carolyn Ford [:

Absolutely.

Hillary Coover [:

What happens when we're trying to address bias? We just saw that. We just saw what happened with Google's Gemini over the weekend, where they had to take their model offline. Someone asked it to generate German soldiers from World War 2, you know, and you don't have an African American or a Black Nazi. You don't have a Chinese or a Japanese, an Asian woman. You don't. All of the generations were incorrect. When you asked for, you know, wanting to see the founding fathers of America, it put in somebody who is obviously Asian or Indian as a founding father. So they were trying to correct bias, but they forgot about what's called historic nuance.

Tracy Bannon [:

So there's a whole lot of other crazy things to consider that we're going to have to understand. Because some of the information that we're talking about, it could be that it's biased, that we're getting biased election information that a certain group of people just believes, because the corpus of data that the generative AI is getting trained from

Carolyn Ford [:

Yeah.

Tracy Bannon [:

Reflects a bias.

Carolyn Ford [:

Yeah. Less nefarious. I promised you both that I would end us before the top of the hour, as much as we're gonna keep going.

Tracy Bannon [:

We're having so much fun now.

Carolyn Ford [:

Thank you so much for your time today, Hillary. Trac, always love talking to you, but this was super fun. We gotta do this again.

Hillary Coover [:

Thanks for having me.

Carolyn Ford [:

And also, everybody knows that, like, Hamilton is Puerto Rican. Right? Thanks for joining us. Listeners, smash that like button. For more from Tech Transforms, follow us on LinkedIn, Twitter, and Instagram.

About the Podcast

Tech Transforms, sponsored by Dynatrace
Tech Transforms talks to some of the most prominent influencers shaping government technology.

About your hosts

Mark Senell

Mark is Vice President of Federal at Dynatrace, where he runs the Federal business and has built out the growth and expansion of the Federal sales team providing unparalleled observability, automation, and intelligence all in one platform. Prior to joining Dynatrace, Mark held senior executive sales positions at IBM, Forcepoint, and Raytheon. Mark has spent the last twenty years supporting the Federal mission across customers in the U.S. Department of Defense, Intelligence Community, and Civilian Federal agencies.
In his spare time, Mark is an avid golfer and college basketball enthusiast. Mark earned a Bachelor of Arts degree from the University of Virginia.

Carolyn Ford

Carolyn Ford is passionate about connecting with people to learn how the power of technology is impacting their lives and how they are using technology to shape the world. She has worked in high tech and federal-focused cybersecurity for more than 15 years. Prior to co-hosting Tech Transforms, Carolyn launched and hosted the award-winning podcast "To The Point Cybersecurity".