AI or Not

E031 - AI or Not - Andy Jenkinson and Pamela Isom

Pamela Isom Season 1 Episode 31

Welcome to "AI or Not," the podcast where we explore the intersection of digital transformation and real-world wisdom, hosted by the accomplished Pamela Isom. With over 25 years of experience guiding leaders in corporate, public, and private sectors, Pamela, the CEO and Founder of IsAdvice & Consulting LLC, is a veteran in successfully navigating the complex realms of artificial intelligence, innovation, cyber issues, governance, data management, and ethical decision-making.

Andy Jenkinson, CEO of CyberSec Innovation Partners and cybersecurity expert, doesn't mince words as he exposes how our entire digital infrastructure stands vulnerable despite decades of warnings. "We are fighting organized cybercrime with disorganized cybersecurity," he states, pointing to the disturbing reality that basic internet protocols like DNS (Domain Name System) remain largely unsecured by even the largest organizations.

Through shocking real-world examples, Jenkinson reveals how seemingly "sophisticated attacks" are often just exploitations of fundamental security failings. These breaches stem from basic negligence rather than advanced hacking techniques.

Particularly troubling is the organizational resistance to addressing these vulnerabilities. Jenkinson describes a culture of willful ignorance that prioritizes deniability over actual protection. Meanwhile, cybercrime costs have skyrocketed to roughly four times the annual budget of the Department of Defense.

For both individuals and businesses, the path forward requires returning to fundamentals. While personal safety measures like covering webcams and being suspicious of unsolicited contacts are important, true security demands organizational accountability. "If you don't know what your internet assets are and what your servers are doing, you have no way of protecting them," Jenkinson warns.

Ready to rethink your digital security approach? Listen now and discover what most cybersecurity professionals aren't telling you – and what might be putting your organization at risk every day.

[00:00] Pamela Isom: This podcast is for informational purposes only. Personal views and opinions expressed by our podcast guests are their own and are not legal, health, tax, or other professional advice, nor official statements by their organizations.

[00:41] Guest views may not be those of the host.

[00:48] Hello and welcome to AI or Not, the podcast, where business leaders from around the globe share wisdom and insights that are needed now to address issues and guide success in your artificial intelligence and digital transformation journey.

[01:03] I am Pamela Isom and I am your podcast host. And today we have Andy Jenkinson. He's our special guest. He's a Chief executive officer at CyberSec Innovation Partners.

[01:16] Andy, we have some common interests, particularly around privacy and better securing Internet assets. I was so impressed with you. I checked out your background and I just thought it would be good for you to share some of your wisdom and insights with those of us who are thriving in this digital world.

[01:36] So thank you for being a guest on this show and welcome to AI Or Not.

[01:41] Andy Jenkinson: It's my pleasure. Thank you for inviting me to start out with.

[01:45] Pamela Isom: Will you tell me more about yourself, your journey, and what's next for you?

[01:52] Andy Jenkinson: Look, we have one focus, and that's to make the world a safer place digitally. We can't do that in the kinetic war because far too many people have vested interests for wars to continue,

[02:04] and that's the military-industrial complex. The reality, and my background, is that I've been in technology, in governance, risk, and compliance, and for the last six and a half years, in cybersecurity.

[02:17] What we witness on a daily basis is that cybersecurity, although it's quite a new, almost novel, fashionable term,

[02:26] the reality is the Internet, from its early days as ARPANET through to the Internet of the last several decades, has left huge exposure for the entire world using it.

[02:42] And that hasn't been helped by the fact that a number of the intelligence agencies wanted to use it for surveillance. And we'll all be familiar with people saying, oh, your laptop's bugged, your phone's bugged.

[02:53] The reality is, it is, but it's not done very securely. So the modus operandi that is used for surveillance is also used for cybercrime. If you stop and think about that for a minute, that's quite a frightening position, because of all of the tech giants in the world that have enabled surveillance.

[03:12] You know, you've seen films with various people supporting surveillance capability through what's known as backdoors. The very same backdoors, that is, the servers and Internet assets, are exposed, exploited, and compromised.

[03:27] Since Edward Snowden's disclosures in 2013, later detailed in his book Permanent Record, the world has known how that happened, how they were doing it. Okay. If you look and plot cybercrime costs and losses from 2013 to today, you'll see they go up like a hockey stick, directly up.

[03:46] Okay. And there's your reason.

[03:49] I can wax lyrical about this. I've written four books on the subject. I was made a fellow of the Cyber Theory Institute back in 2022. But the reality is you and I have very limited opportunities to secure ourselves.

[04:05] We can only do very basic things.

[04:08] I use a cover on my screen camera because I know someone can turn it on without me doing it. Two, I make sure I shut everything down on a daily basis, and I use a cover on my phone.

[04:22] So I cover it so my camera can't be turned on. I can't stop it from recording and other things. But I make sure it's out of sight, out of mind, and out of earshot, if you will.

[04:34] So you'll see the films where people are paranoid about these things and they've every right to be in truth, Pam, because these things are being exploited and compromised for, initially for surveillance, but secondly for cyber crime.

[04:52] So there's quite an opener for you of my background and how we came to be where we are today.

[04:59] What we've then got to acknowledge is, if that is the case, let's just say cybercrime is everywhere, and it is, okay? It's about four times the annual budget of the Department of Defense.

[05:14] Losses to cybercrime annually.

[05:17] Think about that for a moment. The other challenge is that because of a lack of attribution and the lack of governance and compliance and regulatory capability,

cybercrime is rife.

[05:33] I always pose this question, and I'll touch on it in a minute. If you had 1,000 shops, how many shops would you want to be guarded and locked on a daily basis?

[05:47] And the answer is all thousand.

[05:50] We're not doing that in the corporate world. In the digital world, we're leaving laptops on, we're leaving passwords exposed. We're logging on these things every day without thinking about what we should or shouldn't be doing.

[06:04] And that's our responsibility.

[06:07] However,

[06:08] people also have this false perception that their laptop or their phone is secure.

[06:15] It's not. So they type away regardless. You see these outrageous emails and people giving information away or transferring money or personal information online.

[06:27] None of that's secure, no matter what anyone tells you, whether it's two-factor authentication or the company's got "We take your security seriously" at the bottom. It's hogwash, because you press send and it goes to a server.

[06:41] It then gets transmitted from one server to another, to another, to another. Those servers, if any one of them are compromised,

[06:50] you're in trouble. And someone goes, oh, look, we got this whole flood of data coming in. Let me ask you a question.

[06:58] In the 23andMe example that we touched on, or the UnitedHealth Group example that we touched on,

[07:05] tens and tens of millions of people's data is exfiltrated and stolen and subsequently and consequently used for other crimes as well as digital identity theft crime.

[07:18] How do you think the perpetrators manage to get that size and scale of data? They do it because they take it from servers.

[07:28] They don't take it from your phone or a laptop, they take it from servers. In 2021, most people don't know this, and as I mentioned, I never speak out of turn.

[07:37] It's public information that I'm sharing. I've got an encyclopedic knowledge of it.

[07:42] In 2021, November, the FBI sent 100,000 phishing emails.

[07:49] Okay, you go, well, that's just crazy. How can that be?

[07:52] Well, in 2021, the FBI lost command and control of their servers, and the mail exchange servers were used to send the emails. Two things happened. One, the mail exchange servers were compromised and insecure.

[08:07] Two, they had all the email addresses stored within the servers, enabling the emails to be sent.

[08:15] The fact is that they were sent, with malicious code, as phishing emails to 100,000 people. Of course, if you received an FBI email, you'd go, oh my goodness, it's an FBI email.

[08:25] I need to open it. What's the problem? You can understand how that lure of false security happens,

[08:33] but it wouldn't be the hundred thousand recipients' fault, because it would come through as an FBI email sent by the FBI server.

[08:43] Pamela Isom: So thinking about that and all that you said, because there is a lot there, but it's so true. We don't want to live our lives in fear, right? So we want to live our lives knowing what to do to protect ourselves.

[08:59] Knowing what to do and how we can better protect ourselves. And at the same time, we know that it's a village, right? Everyone has to do that, because just as you mentioned the servers, that's not the consumer, that's on the service side of the house.

[09:15] So what kind of advice would you give to business leaders? You know, because we should be looking across that entire supply chain. So I like the example that you used earlier, where you shared that you have the covering over your phone and the covering over your camera.

[09:30] I have coverings for my personal stuff in my wallet, right? So I have some coverings that I use. I'm very mindful, when I use a keypad to do something, to distort the heat after I walk away, so that it's not easily traceable.

[09:48] So I'm mindful of some things. But what can we do better as consumers and then for the business leaders, how can we get better at this?

[09:55] Andy Jenkinson: Well, education is really key. In a study in Australia two years ago, and you'll be familiar with this, you've got the URL address bar at the top of a website, and up until very recently it either had a padlock or it said "not secure."

[10:10] And the "not secure" means it's got a configuration error. You may be going to a spoof website, an evil twin website, or whatever,

[10:17] but the responsibility for that lies with the owner of the website. But put that aside for a minute. In the study in Australia, they surveyed 2,000 people, and 90% still went on a not-secure website and put their data in, their PII data.

[10:37] So people need protecting from themselves. That's a very scary statistic, because everyone should check. And whenever I go anywhere and talk to people, all of a sudden they're paranoid and looking at their URL address bar to see if it's secure or not, which is a great thing.
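The address-bar check Jenkinson describes can be approximated programmatically. Below is a minimal, illustrative Python sketch (not the guest's tooling); it treats plain HTTP as "not secure" and performs a real certificate-verifying TLS handshake for HTTPS. The hostname is an example only.

```python
# A minimal sketch of what the browser's "not secure" label amounts to.
import socket
import ssl
from urllib.parse import urlparse

def url_transport_risk(url: str, timeout: float = 5.0) -> str:
    """Classify a URL the way the address bar does: plain HTTP is always
    'not secure'; HTTPS gets a TLS handshake that verifies the
    certificate chain and the hostname."""
    parts = urlparse(url)
    if parts.scheme != "https":
        return "not secure"  # nothing is encrypted in flight
    context = ssl.create_default_context()  # system CA roots, hostname check on
    try:
        with socket.create_connection((parts.hostname, parts.port or 443),
                                      timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=parts.hostname):
                return "secure"  # chain and name verified
    except ssl.SSLCertVerificationError:
        return "not secure"  # expired cert, wrong name, or broken chain
    except OSError:
        return "unreachable"

print(url_transport_risk("http://example.com"))  # → not secure
```

Note that, as Jenkinson says, "secure" here only means the connection is encrypted and the certificate matches; it says nothing about whether the site itself is legitimate.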

[10:54] But it's quite funny, because this has been going on for decades. Nothing new. It's a little bit like this. I use an example. I assume you drive, Pam.

[11:03] Pamela Isom: Yes.

[11:04] Andy Jenkinson: And when you change your car, do you think, oh, I need to remember every time I go in to fill up with gas, I need to remember which side to go.

[11:12] People do that. But on your fuel gauge there'll be a little arrow next to the little picture of the pump that'll show you which side the fuel cap is. That's been like it for decades.

[11:23] But the vast majority of people don't know, and it's right in front of their eyes. Seriously. You do a survey on that next time you get a minute and say, you know, how do you remember which side of the car the gas goes in?

[11:39] And they go, oh, you know, I have to remember it. It's right in front of them on the gauge. Get in your car later today or tomorrow and you'll look and go, oh my God, there it is.

[11:49] So the challenge we've got here, and I met with the Home Office this week in the UK for two hours, is that we have fraud at unprecedented levels, and I'm sure the Americans do too.

[12:05] We've Got people that are what we call the lower end of the cyber crime scale,

[12:12] phoning people up, typically older people with bank accounts, defrauding them of money. They do that. Now, this is just typical fraud on a daily basis.

[12:23] Not corporate. We'll move to corporate in a minute. But if, for example, Mrs. Smith, which is a good English name, Mrs. Smith gets a phone call from her "bank", inverted commas:

[12:35] Mrs. Smith, there's been some malicious activity on your account today.

[12:39] Your account number is 12456789 10, and your sort code is da da da da da da da. And Mrs. Smith goes, this person obviously knows who I am. They're obviously phoning from the bank.

[12:51] But all they've done, Pam, is get the data, stolen data, from a server that's compromised, and it's been sold for pennies in the pound.

[13:03] And fraudsters are phoning up to try and scam Mrs. Smith out of money. What they then go through a narrative and say is, well, Mrs. Smith, look, if you bring it up on screen, you can change your password.

[13:15] Now give me the password. We'll actually make the changes for you this end. We'll make sure that we reimburse any monies that go out. Mrs. Smith believes everything she's told.

[13:25] Mrs. Smith gives the new password, the fraudster takes over the account, locks Mrs. Smith out, and drains the company bank account or her account. Okay? Now who's to blame for that?

[13:39] In the UK, the law just changed so that the banks are responsible for refunding Mrs. Smith.

[13:46] We're talking hundreds of millions of pounds. Okay,

[13:50] why is that really important? Two things. One, Mrs. Smith has no idea she's picked up the phone. Someone knows her details, her home address, all of her PII data.

[14:01] But when you start analyzing it forensically, you see the amount of fraud going on in any one particular bank. And I did some research on the bank, then shared it with

[14:12] the FCA, which is our SEC.

[14:16] We showed that the bank's servers were compromised and had been for four years, allowing somebody to take that data and to exploit it.

[14:25] Pamela Isom: Oh, my.

[14:26] Andy Jenkinson: The bank is 100% responsible for the hemorrhaging of data and the clearing out of hundreds of millions of pounds collectively from people's accounts because someone's contacting them. That's just plain vanilla fraud and scamming.

[14:44] I've taught my father. I won't use the words he used, but he gets a phone call once or twice a week, and he actually tells them to go somewhere else, okay, in a very nice way.

[14:56] Because at first they kept questioning him. And he's 82. He had no idea what was going on.

[15:04] But he's what's known as the Silver Surfer brigade, where he knows his way around a phone and a PC,

[15:11] but he doesn't understand how people might want to defraud him and trick him into giving them data.

[15:18] That's what's happening. And the banks need to do more to protect them. I can wax lyrical about Bank of America spending $1 billion on security but wasting that $1 billion because their websites aren't secure or their servers aren't secure.

[15:32] And they want to sue me because I've brought this to their attention.

[15:37] Pamela Isom: Well, that doesn't work.

[15:39] Andy Jenkinson: It doesn't work at all because I actually invite it. I say, let's go to court. Take me in there and I'll tell the world what you really are doing.

[15:47] Pamela Isom: Yeah, that won't work. But that's another conversation. That's a good one.

[15:53] Andy Jenkinson: There's two things here. One, mitigation, for Mrs. Smith or my father, against being defrauded out of their wealth, whatever scale it be. But two, the liability flow. That liability must pass onto the owners of the business or the banks or whatever, because they've a responsibility, whether it be a regulatory compliance,

[16:21] or just good business practice.

[16:24] Currently, Pam, they're not doing their job very well.

[16:27] Pamela Isom: Okay, so two points. Personal mitigation. Do what we can to be aware of the types of scams and to protect our assets in general. So be very mindful of the con artists and the scammers and know that they tend to target certain generations, certain ages.

[16:49] They tend to. But they don't stop there because they go after our youth as well. So be mindful of that. And then second of all, if you're a business leader, be accountable.

[17:01] So because there is a liability and you are liable for the damages that are caused as a result of your negligence.

[17:11] So I think about this situation that happened. We could talk about 23andMe, but first, here's something that's on my mind. Actually, there's a lot that has happened.

[17:20] So we all have experienced these things, but just recently I get this urgent call telling me that I didn't show up for jury duty.

[17:29] Seriously, you didn't show up for jury duty? I'm calling from blah, blah, blah, blah, blah. I want to know why you didn't show up. We want to help you because I've got a warrant here for your arrest.

[17:40] And I was just like, seriously, Right. Who are you? I'm Captain, blah, blah, blah, from the warrant office or something. And we show that you should have received a summons and that we dropped it at your door.

[17:56] All these things are red flags because I know that's not how it works. And is your address blah, blah, blah. And so I wouldn't comment.

[18:05] And so can you just confirm is your address blah, blah, blah. And so I was finally like, okay, look here.

[18:12] So I don't know who you are, ma'am. I'm captain from the police department in the warrant office. And we're about to come out, and I'm trying to prevent them from coming out there to get you and put you in jail.

[18:25] So I said,

[18:26] I'm knowing this is a con artist, but I'm just, like, listening because you never know. And so I'm listening. And I realized that they do this to people all day long.

[18:38] It's a crying shame. It's just a crying shame. This is not a digital threat. They got my digital data from somewhere, and then they called my phone number telling me this had my name, didn't quite have my address.

[18:51] So I could hear them trying to confirm my address. I'm not sure they had my name properly. But it's just a shame what we have to deal with. So what made me think of that is things that we can do to help ourselves.

[19:04] And so in that case, I said, let me let you talk to someone who I know is a police officer. And then I called this person, right, while the caller was still on the phone, and they hung up.

[19:18] Andy Jenkinson: Yeah, absolutely. That's a good thing to do. But also another one is to say, give me your number, I'll phone you back. I can't take the call at the moment. So deflect them.

[19:26] It's called a pattern interrupt in psychological terms:

[19:30] break their pattern. So they're on the back foot. Say, let me take your number, I'm in the middle of something, I'll phone you back.

[19:37] Pamela Isom: Okay?

[19:38] Andy Jenkinson: Okay. And they invariably will just go, because they're playing a numbers game. It's really important for your listeners to understand that something around 30,000 websites a day are hacked.

[19:52] 30,000, okay. There's over one and a half billion websites, so it's a relative term. But if one of those websites is your bank, or a website you frequent, or an auction website, and you think you're going on there and you're putting personal information in, there are things you must do.

[20:13] One, if it says "not secure," run. Never go onto that website. Okay. If you know what you're doing, you can analyze it and see why it's not secure.

[20:22] And I'll give you a story of how this happens. But it doesn't always happen in the way you'd want it to.

[20:30] You remember in COVID 19, the lockdown, a lot of small businesses were struggling financially. And in Germany, like every other country, the government played a very fair game and said, we're going to launch a website that you can, as a small business, claim up to €25,000.

[20:49] All you've got to do is register, give us your information, we're going to analyze it, and we're going to award you a €25,000 advance.

[20:57] Well, they did that. They made it very public. And within the first three, four weeks, hundreds of thousands of people registered on the German government website,

[21:09] or at least what they thought was the government German website.

[21:15] Okay,

[21:16] What happened then: after four weeks, people started making phone calls. Because we all want an easy life. We all type away and expect a result. We don't expect to have to phone up.

[21:26] So these people making phone calls,

[21:29] they've got balls in lots of ways because most people don't want to make a phone call. They just want to type.

[21:35] So after four weeks, the government started receiving calls and complaints that the people hadn't received any money.

[21:43] Okay?

[21:44] So the government said, well, hold on, we've sent you the money. We sent it to you. Your name, your business, your account, everything you gave us. And they said, well, we've not received it.

[21:55] So they did some investigation.

[21:57] Pam, in two weeks, the German government paid out 100 million euros, which is not dissimilar to $100 million,

[22:08] to bogus cyber criminals that stood up an evil twin website that sat in front of the German government website.

[22:17] What it did was it took all the real information from all the real claimants,

[22:24] repopulated that information into the real German government website, changing one thing, the bank account details.

[22:33] So two things happened thereafter. One, the German government obviously had to close the bogus website, the evil twin. Two, they were €100 million out of pocket, but they still needed to support the people that were the legitimate claimants.

[22:47] So it cost them twice as much.
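One cheap, automatable defense against the evil-twin pattern in that story is flagging domains that are near-identical to ones you actually trust. Below is a hypothetical Python sketch using the standard library's `difflib`; the domain names are invented stand-ins, not the real German portal.

```python
# Hypothetical lookalike-domain check: flag near-matches to trusted names.
from difflib import SequenceMatcher

# Invented example of a trusted domain; in practice, your own allowlist.
TRUSTED = {"soforthilfe-corona.nrw.de"}

def lookalike_score(candidate: str, trusted: str) -> float:
    """Similarity ratio: 0.0 = unrelated, 1.0 = identical."""
    return SequenceMatcher(None, candidate.lower(), trusted.lower()).ratio()

def is_suspicious(candidate: str, threshold: float = 0.8) -> bool:
    """Near-identical but not exact: the classic typosquat / evil twin."""
    for trusted in TRUSTED:
        score = lookalike_score(candidate, trusted)
        if score >= threshold and candidate.lower() != trusted:
            return True
    return False

print(is_suspicious("soforthilfe-corona.nrw.de"))   # exact match → False
print(is_suspicious("soforthilfe-cor0na.nrw.de"))   # one-char swap → True
```

A real deployment would layer this with certificate transparency monitoring and newly-registered-domain feeds, but the core idea, comparing what users see against what you own, is this simple.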

[22:49] Pamela Isom: Yeah, that's just not worth it.

[22:51] Andy Jenkinson: Two weeks, Pam. Two weeks. €100 million. This is public information, 2021, 2022.

[22:59] Pamela Isom: I think that's terrible.

[23:01] Andy Jenkinson: I use the terminology. We are fighting organized cybercrime with disorganized cyber security.

[23:10] Pamela Isom: So what else can we do to protect ourselves?

[23:12] Andy Jenkinson: Well, look,

[23:14] the biggest challenge I see is that governments and institutions, large institutions,

[23:21] they may talk a good story, but they're not really doing their job very well. And I'll give you an example of that. I won't name the companies, for fear of reprisals and litigation, but a local council here just signed a deal, a three-plus-one-year outsourcing cloud deal, onto servers that are not secure.

[23:43] So they've gone from on-prem to off-prem cloud, because it's centrally funded by government. But no one in procurement checked whether that data would be secure in flight or at rest, and it's not.

[23:56] So it's going to be compromised from day one. That's a terrible thing. We've got to lobby governments and departments and companies to do their job better because we can't fight with that.

[24:10] For example, if you went on to pay your water bill today, let's say you pay a water bill and it's $200 and then they go, oh, could you please make the payment?

[24:19] You make that payment. If that's a bogus website,

[24:23] and they want to cut your water off, they've got to take responsibility that that website was enabled to do so. We had this with Oldsmar water in Florida, I'm sure you remember, where they changed the chemical levels.

[24:37] And it was only by sheer chance that one guy saw the cursor moving around, with no one sat there, changing the levels. I think it was from about 100 parts per million to 11,100.

[24:47] It would have killed people.

[24:49] Pamela Isom: He caught that vulnerability and saved some lives, that's for sure.

[24:52] Andy Jenkinson: Absolutely. But that's the sort of thing where, in that case, either Oldsmar is responsible or Oldsmar's technology providers are. Here's a really big issue. Third-party risk is a throwaway term that genuinely is not adhered to, and no one seems to care.

[25:14] That's a really worrying situation, because when my friend, the solicitor that signed that contract under duress for the council here in the UK,

[25:27] she asked us to look at the exposed position and the security posture of the organization. And we're talking about a global top 10 company in the world here, okay? In this space.

[25:41] I shared the information with her. She took that to the chief executives of the councils and they said we're not interested in security, we just want to migrate the data.

[25:51] They actually were told the problems, they ignored the problems, and they go to bed at night and sleep very well. That's just outrageous. Because the other problem is, if people genuinely don't know how to analyze and investigate post-incident,

[26:10] they never find the real root cause. So the real mitigation never happens, the liability never lands, and the provider of these compromised services typically gets away with it.

[26:23] They say it was a sophisticated attack, or it was because two-factor authentication wasn't enabled. Those are symptoms. The root cause is typically that servers were compromised, which enabled the accessibility to get in and do the dirty.

[26:40] Pamela Isom: I hear about credential stuffing a lot. What's your take on that?

[26:45] Andy Jenkinson: Same thing. Look,

[26:48] going back to the shops, if you had a thousand shops with high value products in each of them, and they were all connected, all right, like the digital world, all connected, as opposed to one in San Francisco, one in California, one in Florida.

[27:03] In the digital world, they're all connected. But take those 1,000 shops: how many would you want to leave unlocked 24/7, in a high-crime-rate area, for all of the goods to be taken with no attribution?

[27:17] So we've made it worse for ourselves in many ways.
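Credential stuffing, which Pamela asked about, works because passwords reused across sites have already leaked from some compromised server. One practical check individuals and businesses can run is the k-anonymity model of the real Have I Been Pwned "Pwned Passwords" range API: only the first five hex characters of the password's SHA-1 ever leave your machine. A sketch of the client-side half in Python (the password is an example; the network call is left as a comment):

```python
# k-anonymity check sketch: hash locally, send only a 5-char prefix.
import hashlib

def hibp_range_query(password: str) -> tuple[str, str]:
    """Return (prefix_to_send, suffix_to_match_locally) for the
    Pwned Passwords range API, which keys on uppercase SHA-1."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hibp_range_query("password123")  # example password only
# You would then GET https://api.pwnedpasswords.com/range/<prefix>
# and scan the "SUFFIX:count" lines of the response for your suffix;
# the server never learns the full hash, let alone the password.
print(len(prefix), len(suffix))  # → 5 35
```

If the suffix appears in the response, that password has been seen in breach corpora and is exactly the kind of credential the stuffers' automated scripts will try.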

[27:21] And AI is another major area, as is IoT, as is BYOD, bring your own device. I was guest chair at an AI summit in Copenhagen, and the summit was on AI and cybersecurity.

[27:38] So I sat there in this huge room in Copenhagen, which is a beautiful part of the world, and all of these delegates of the 60 largest companies in Denmark came in the room.

[27:49] CIOs and CISOs, okay, so they were technical.

[27:54] And they all came in, sat down.

[27:56] I think there was four vendors there spending an absolute fortune to be there. And I'm chairing the meeting, so I've got carte blanche,

[28:04] which is just as well. So everyone sits down. Pam, I asked two questions before introducing my panel guests.

[28:12] And I said two things. One, how did you all get in here in this room?

[28:19] And they said, it's a very strange question. We came through the doors, said, how would you have got in this room if I'd have locked those doors? And they said, we wouldn't have been able to.

[28:29] I said, that's cybercrime.

[28:32] Lock the doors.

[28:34] If you can't get in, you can't take out. It's as simple as that. And the next question I asked, and I don't know how familiar you are with the Domain Name System, which is part of the Internet assets and

[28:46] Dr. Paul Mockapetris's foundational Internet protocol. I said, how many of you, of the 60 people in this room, all paying thousands to be here,

[28:55] how many of you manage and control your DNS and your servers on a daily basis?

[29:00] Two people put their hand up.

[29:02] Pamela Isom: They don't know what it is, right? So it's almost like... we did pay closer attention to it when the Internet first came out, but now it's like everything is delegated out.

[29:14] And that's one of the concerns that I have, is we're not paying enough attention to DNS and the overarching Internet ecosystem and some of the foundational components which makes them vulnerable.

[29:26] Andy Jenkinson: Pam, look, you're younger than I am, so I can wax lyrical a bit more on this than you, but I've studied this area for many years.

[29:35] The transition to outsourcing happened over a period that was, you know, almost instant. And you remember Blockbuster, the video store? Okay, yeah. Blockbuster was put out of business because Netflix turned around and said, well, we can stream this stuff.

[29:52] Yeah, you can just go online and stream videos. You don't need to, and we can automatically charge you. There's no cash. It's all budgetable.

[30:02] So in about 2010, 2011,

[30:06] Netflix and others, Spotify and music streaming, they all went online.

[30:12] People are generally lazy, but they're also intolerant of any latency. Okay? So that heralded a marked change: companies shifting from using their capex, their capital expenditure, to run data centers, onto opex, operating costs, and outsourced data centers, which includes DNS, the Domain Name System, and content distribution networks.

[30:41] It proved incredibly successful.

[30:45] And you'll see now that data centers are springing up everywhere by the big players,

[30:50] but they're not very secure. But nobody's checking.

[30:56] You get your distribution, you get your DNS hopping around. You can watch a film that's being played here, and vice versa, but no one's checking the servers that are sharing that content or hosting that content, enabling access to it.

[31:12] They're not checking that they're secure.

[31:15] Pamela Isom: And is it because we don't know how? Is that why? Or some generations just don't know about DNS?

[31:23] Andy Jenkinson: They don't. And let me tell you why that is, Pam, because look, again, I've written about this extensively. In 2001, we had the Twin Tower atrocities.

[31:34] Up until that point, the NSA and GCHQ had been very active in exploiting the Internet for surveillance. And I'm not going to beat them up on that. That's entirely up to them.

[31:46] I think it was a little bit unlawful. I also think it was a little bit inhumane. But nonetheless, they did it. There's nothing you can change about it, nor can I. In 2001,

[31:56] four months before the Twin Towers, Jim Gosler, who was the father of digital offensive capability at the CIA and NSA, turned around to the intelligence community at his retirement and said,

[32:10] unless we learn to adapt, the Internet will eat us alive.

[32:15] In 2001, not only did the governments turn around and say, we want total dominance, they also wanted to remove the ability for other people to defend against it. So you will struggle to find any academic organization whose cybersecurity training curriculum

[32:37] covers DNS in any depth.

[32:40] Think about that for a minute. So we've got this huge generational knowledge gap, not least because it suited someone. And you don't go to the gymnasium once and you're fit for life.

[32:51] You need to go every day. It's no different with this. Let me tell you a real-life story, and I'm sure it'll be fine because it's, you know, public information.

[33:01] Verizon are known as the cyber warriors of the world. They went into Target and other organizations when they had cyber attacks and they were dropped in.

[33:11] But Verizon adopted a framework to look at DNS as part of their red teaming.

[33:16] Pamela Isom: That's good.

[33:18] Andy Jenkinson: Yeah. What does that say to you? They never bothered to look.

[33:23] Pamela Isom: Yeah, that doesn't make sense to me.

[33:28] Andy Jenkinson: So look,

[33:30] America's more litigious than we are.

[33:33] If you were a company that spent tens of millions after losing hundreds of millions in a cyber attack and no one bothered to check your servers, wouldn't you be a bit hacked off?

[33:45] Pamela Isom: I would be.

[33:47] Andy Jenkinson: Now, isn't there a liability issue there too? You're the experts. Why haven't you done this? So I can tell you a little story. One of the big four,

[33:59] they drove an eight-hour round trip to meet with me. And the global head of security said to me, Andy, I'm hearing great things about you, but I don't understand.

[34:10] Tell me about it. So I told him what we do and he said, yeah, I think we do that. I said, no, no, you don't. I said, you've been waxing lyrical about the $50 million you've been paid since the cyber attack on a shipping organization.

[34:24] And they, and you, are totally insecure and have been for eight years.

[34:29] And I said, here's the evidence.

[34:32] So what we're seeing, Pam, and look, what I'm highlighting: I'm not trying to embarrass anybody. I'm not naming names that aren't okay to name. What I'm highlighting is that there is a huge disconnect and knowledge gap around basic, fundamental security, the kind that, when it's missing, undermines every other security measure.

[34:54] And that's a fact.

[34:56] Pamela Isom: So I agree with you. And I always talk about how we're forgetting some of those foundational elements, basic hygiene that can help us out so much, that we need to be mindful of and that we need to practice.

[35:13] And so you brought out some of those on this call, which I think is good. And even for business leaders, some of these things are foundational items that we can take care of.

[35:23] I like to use the CMMC. Sometimes I tell people, just go look at CMMC level one. That gets you to the basic hygiene fundamental steps, including backup and recovery and also closer monitoring of the servers, as you pointed out.

[35:41] You're right. I still don't see a whole lot around DNS, PKI, things like that. But I do feel confident that paying attention to it matters even if you're not doing business with the government, even if you just want to be more of a steward of your assets.

[36:00] I do think that's a good place to go and reference and get some checkpoints and validation.

[36:05] Andy Jenkinson: Pam, look, you've got a problem there with CMMC. As you know, it was a stillborn child over a decade ago and it's been dragging on ever since. It used to be five levels.

[36:17] Now it's three levels, and levels one and two actually demand DNS controls. They do demand DNS controls. But they're not regulated and they're not adhered to. So let me tell you what I did last year.

[36:29] These are open source stories I'm sharing with you. I've got a report; if you ask me, I'll forward it to you after this conversation.

[36:37] In September,

[36:39] I think the 17th, 2023, an F-35 ejected the pilot and the plane carried on for 60 miles and crashed in North Carolina. You may recall that.

[36:52] Pamela Isom: I do.

[36:53] Andy Jenkinson: And the pilot was ejected. He didn't self-eject.

[36:58] What you might not understand is that there are three variants of the F-35. It's a trillion-dollar business and economy for Lockheed Martin and others.

[37:08] What happened is there is an A variant, a B variant and a C variant. The A and the C are short and long takeoff, for aircraft carriers or typical runways. The B is the only variant that takes off vertically.

[37:23] Okay, so it just goes straight up, vertically.

[37:26] The B is also the only one where the pilot doesn't have to interact with the ejector seat.

[37:34] It is automated because of the increased dangers of the vertical takeoff. Now, what you may not know, because it's not widely publicly known or published, is that in the last two years three F-35s have ejected their pilots.

[37:51] All of them have been F-35B variants, where the pilot did not need to interact with the ejector seat but got ejected without their permission or consent. Okay, so there's a hundred-million-dollar plane crashed on each occasion.

[38:09] Okay. There's three of them. Okay. Lockheed Martin the next day get an order for another one. A hundred million in revenue, very good business. Put that aside.

[38:19] We were pretty hacked off about this. And I'll tell you the sequence of events: on the 11th of January 2023. I've got a memory like an elephant. I'm very lucky.

[38:29] The FAA in America closed the airspace. You may recall that too. Federal Aviation Administration turned around to the public and said we've had a technical glitch.

[38:40] Okay. Actually, Pam, we helped them on that day. I've got an email thanking me for it, because they lost command and control of the servers.

[38:49] Okay. And the only way they could protect the public and citizens using air travel on that day, to avoid a digital 9/11, was to close the airspace. The NOTAM servers, NOTAM being Notice to Airmen, without them you cannot fly a commercial or any other plane.

[39:09] So that means the U.S. Air Force, the commercial planes and also Air Force One couldn't fly. All that time, the NOTAM servers were not secure. They changed the digital certificate.

[39:23] I can neither confirm nor deny, because I wasn't hands-on enough. It's not under NDA, so I can talk about it.

[39:29] I cannot confirm or deny whether a ransom was paid, or who the perpetrators were that may have taken over command and control. So, rolling on a little bit: when an F-35 ejects a pilot and we start researching it, we go, well, hold on a minute.

[39:43] So we've had airspace closure. We've now got F-35 jets ejecting pilots against their will. Let's look at the F-35 a bit more. We did a report for J3.

[39:54] I'm not sure if you're familiar with J3. It's the highest level of clearance. We did a report for them, and we showed that Lockheed Martin, Chris Blake, who make the ejector seats, and also Green Hills Software were all totally compromised.

[40:10] Which means if they could be compromised, so could the F-35 program, the operating system and the ejector seat.

[40:21] Think about that for a moment. I'm not a conspiracy theorist. I'm sharing facts with you. Equally, when the Dali, the ship, hit the Baltimore bridge and collapsed the bridge,

[40:32] what a lot of people don't realize is that that was done because a waypoint was added into the satellite navigation system,

[40:40] okay? Which took the Dali off course to crash into the bridge.

[40:45] A lot of people don't realize that there is a huge US Marine base on the inside of Baltimore that couldn't get their ships in and out.

[40:56] Think about the implications now. If you want to go to war, you need your fleet and you need air supremacy and capability. And if you lose both because of incidents, let's call them incidents,

[41:11] all of a sudden someone's saber rattling and showing their hand.

[41:16] Pamela Isom: So tell me, I have two things I want to know. This is interesting and very thought-provoking. Right. So this will be very informative and helpful, I'm sure, to the listeners.

[41:27] Tell me your perspectives on AI. You started to touch on it a little bit, but then we started talking about some scenarios,

[41:35] AI exacerbating the issues. I know that it is to some extent, but what's your perspective on whether it can help, or is it making things worse? What's your take on AI?

[41:48] Andy Jenkinson: Great question, and I need to go back a little bit in time to answer that and I'll come full cycle.

[41:56] When computing started, no one thought about security.

[42:00] Nobody.

[42:01] Laptops, computers, home computers were sent out, and no one ever thought about what we need to do in case someone takes control of our system.

[42:11] Roll that forward to when Paul Mockapetris invented DNS, and to the mid-90s when PKI, Public Key Infrastructure, was invented to enable encryption and certification, et cetera.

[42:23] We add AI complexity to it and it actually exacerbates the problem. And I'll tell you why. We use a number of technologies and tools. We have our own proprietary tools, but we also use open source intelligence.

[42:35] And as you'll know, most open source intelligence was developed by the American government or the American military to enable the gaps of Internet assets to be identified by security professionals so they could address them.

[42:49] OSINT technology was formed, and it actually goes all the way back to the CIA in the late 40s and 50s. Not technical then, but, you know, open source intelligence.

[43:00] If we then look at the fact that unfortunately people can be lazy, you know, we must all acknowledge that we all want an easy life and not many people do 14 hour days, seven days a week like I do.

[43:14] But most people want to do a 9 to 5 and go home and relax.

[43:18] And I get that. The point there is open source intelligence was developed to aid professionals better secure themselves. But the reality is it's better used to identify exposed and vulnerable positions to be exploited.

[43:34] Okay, now add AI onto that. AI is being found guilty in lawsuits, et cetera, for plagiarizing documentation, for deepfakes, for videos. As we know, if we add this AI capability onto open source intelligence, with 30,000 websites a day being hacked, and they're only being hacked, Pam, because someone isn't doing their job very well.

[44:00] They're not secure.

[44:02] Trust me, very few are brute-forced. The majority are a case of someone going, oh, it's not secure.

[44:08] We're treated very badly,

[44:10] as if we're telling someone they've got an ugly child.

[44:14] So they resist that. But what these guys do is they then pass it on to their technical and tactical attack teams, who just attack.

[44:24] Okay? So it can take minutes, if not hours, for an attack to happen.

[44:31] So when someone says to me, as happened this week with one of the world's largest banks. He was the global CSO there. I won't name the bank because it would be too embarrassing.

[44:41] I'm not under NDA, but I'm being a little bit protective.

[44:45] Last week he approached me and he said, I've left that bank because they've had cost cutting and changes.

[44:52] I'd be very interested, I've been reading your stuff, it's mind-blowing and you're perfectly right. We're not looking. We've not been looking. It's one of the world's largest banks, and I'm very keen to see where they're exposed in these areas.

[45:05] So I said, it'd be my pleasure, give me a few hours. Within a few hours, I shared some really damning stuff, including an exhausted Outlook subdomain of theirs for all their mail exchange.

[45:18] Okay? For one of the world's largest banks. So he's gone, almost. And the digital certificate had expired the week before. So it gives you some idea of just how lax the basic security capability is.

[45:30] Or was it a deliberate insider threat? That I don't know, and at the moment I don't care. So I shared the information with him. He said, I'm meeting my successor at the weekend.

[45:41] So I said, great, share the information. I got a message back last night: I met with him, and yeah, it didn't surprise him. Well, that's okay then. One of the world's largest banks has got not-secure subdomains and insecure servers.

[45:54] And the guy, instead of going, oh my God, could you come and talk to us, come and help us, has gone, yeah, I'm not surprised.

[46:01] So the point I'd make there, Pam, is, look, we all bank with banks. If the banks don't do their job properly, they need to be held accountable.

[46:11] It's no good just carrying on as we see every day, where the narrative is, oh, it was a Russian guy, a Chinese guy or a Korean company, a sophisticated attack. It's not. It's because somebody isn't doing their job very well.
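The expired-certificate finding Jenkinson describes is checkable with nothing but the Python standard library. What follows is a minimal sketch, not his tooling; `mail.example.com` is a placeholder hostname, not any real bank's domain. Note that a certificate that has already lapsed will usually fail the TLS handshake itself with `ssl.SSLCertVerificationError`, which is itself the finding.

```python
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after: str) -> datetime:
    """Parse the 'Jun  1 12:00:00 2026 GMT' format used by ssl.getpeercert()."""
    dt = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return dt.replace(tzinfo=timezone.utc)

def days_until_expiry(not_after: str, now: datetime) -> int:
    """Whole days until the certificate lapses; negative means it already has."""
    return (parse_not_after(not_after) - now).days

def cert_expiry(hostname: str, port: int = 443) -> datetime:
    """Connect over TLS and return the served certificate's expiry time.

    If the certificate has already expired, the handshake below raises
    ssl.SSLCertVerificationError ("certificate has expired"), which answers
    the question on its own.
    """
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            not_after = tls.getpeercert()["notAfter"]
    return parse_not_after(not_after)

# Hypothetical usage (placeholder hostname, requires network access):
#   expiry = cert_expiry("mail.example.com")
#   print((expiry - datetime.now(timezone.utc)).days, "days left")
```

Run on a schedule across every hostname in an asset register, a check like this is exactly the "basic hygiene" monitoring that would have flagged the lapsed certificate a week before anyone was told about it.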

[46:27] Pamela Isom: And I think I also hear in what you're saying that you can blame it on tools like AI and other tools, but really those are used to take advantage of and exploit vulnerabilities that already exist.

[46:43] Andy Jenkinson: Already exist. They're just ramping up the speed of attack and the exposure. So look, the reality is this. I talk from experience, but not firsthand.

[46:58] I would suggest that some of the gangs that they are labeling as such, APT29 or whatever,

[47:05] I should think they've got more attack options than they know what to do with, and they are selecting the ones that are most likely to pay out, because they don't want the exposure.

[47:19] So for example,

[47:21] the UnitedHealth Group, the hospital group, had a cyber attack, okay? And I wrote to them on the day, because this stuff comes to me and gets flagged up. I wrote to them on the day and said, hey guys, look, these are your servers, they're all exposed, you might want to address these.

[47:38] And they go, oh no, no, it's fine, we've got experts coming in. Okay, well, why are they still like it today? You spent one and a half billion dollars; why are your servers still exposed, and why are you still maintaining not-secure subdomains?

[47:53] How good are these experts? How good was your team in the first place, Pam, to enable this position? They're not asking for permission or forgiveness. They will just pay out of a trickle-down economy in profits: instead of making 100 million, they may show 50 million and lose the other 50.

[48:10] Because their mates are doing cybersecurity down the road doesn't mean there's any more security.

[48:15] There's your problem.

[48:17] So they come along and go, well, would you go to a vulnerability disclosure program, a VDP? And I go, hold on. So you've got a team of people you're spending millions on and they're not doing their job very well, but you want me to come in and do their job for free?

[48:38] Makes no sense, Pam.

[48:39] Pamela Isom: It makes no sense. A lot of the things that we've talked about really make no sense. If we take full responsibility and if there are accountabilities put in place, which I think is a common theme that you've pointed out throughout this whole discussion, right, and take some personal responsibility,

[49:00] but also the business leaders and these businesses should be accountable for their actions. And a lot of times that is hard to accomplish. There comes finger-pointing.

[49:12] The issue with the individual who said, I'm not surprised: I almost feel like that person should be held accountable too, because that's not good.

[49:22] Andy Jenkinson: We've got some new legislation called DORA, the Digital Operational Resilience Act.

[49:26] And DORA will have an impact, to an extent, because what it actually stipulates is that people should be held accountable, and they can no longer use plausible deniability and say, oh, we didn't know.

[49:40] Ignorance is not a plea. And what we're talking about here, in the main, is nearly 40-year-old Internet protocols. It's not as if this just came out last year and we didn't get round to it.

[49:52] This has been around since 1986. Okay, so the reality is, ignorance is a choice, immaterial of your choice of saying, well, we've got a SOC, we've got a SIEM, we do this, we do that.

[50:05] If you don't do what we're advocating and it's been advocated since 86,

[50:11] then you are your own worst enemy.

[50:14] Pamela Isom: Right?

[50:15] Andy Jenkinson: You cannot have one website or a thousand websites and their servers insecure and pretend everything's going to be okay.

[50:24] Going back to that group, whether it be Russia, China, Korea, wherever.

[50:29] If they've got, like the United States healthcare group, 100 million people impacted from that now, Pam. 100 million, okay? They spent 1.5 billion to stand still, to be in exactly the same exposure.

[50:45] Somebody should be locked up for that. I'm sorry, that's the way I feel about it. Because we had it here in the National Health Service, which, you know, ours is government-funded; you contribute as a taxpayer,

[50:59] but you don't pay an insurance premium or policy. You can have that separately, but in the main the National Health Service supports you. They had a cyber attack in the blood transfusion group, and you won't read about it readily, but people died. People lost their lives because they had cancelled operations for weeks and weeks and weeks.

[51:17] I shared information with them, it's nhbt.co.uk or whatever it is, at the time, with their chief medical officer and their data protection officer. We play with a very straight bat.

[51:31] Yes, we want to be paid for our expertise, but we'll share the fact that you've got a problem for free. We shared the information with them. They threatened legal action.

[51:41] Pamela Isom: Because you told them.

[51:43] Andy Jenkinson: Because we told them they've got a problem.

[51:45] Pamela Isom: Well, isn't that interesting?

[51:47] Andy Jenkinson: Isn't that interesting?

[51:50] How bad is the insider threat?

[51:53] How much of the outcome is a preconceived outcome that they can knowingly not attribute any costs and losses to, but may all benefit from?

[52:06] Could it be that some people are walking both sides of the street, creating the demand and having the supply?

[52:13] Let's not lose sight here that we're talking about a $10 trillion economy and marketplace.

[52:19] If we think drug running and trafficking is bad, this makes that look like a corner-shop robbery.

[52:27] Pamela Isom: It is suspicious. It is definitely suspicious, because you tell it like it is. And it...

[52:33] Andy Jenkinson: I only deal with facts, Pam. I was with the Home Office for two hours. I said, look, you will get people that say that Andy Jenkinson's a contentious person. And I go, they're wrong.

[52:43] It's only contentious because they don't like the facts, and they only know the facts because I'm telling them the facts. It's not an opinion, it's not a bias.

[52:52] It's not subjective. It is objective fact.

[52:57] Open source intelligence interrogates servers and websites to give you the output; what the servers give that OSINT technology, that's it. If I can show you that one organization is currently running 65 million IPv4 addresses on DNS-blacklisted data centers and servers, you'd go, how can that be okay?

[53:21] And I go, that's what it is. And they're government servers.
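The blacklist lookup behind a claim like that is mechanically simple. A DNS blacklist (DNSBL) is queried by reversing an IPv4 address's octets and prepending them to the list's zone; an answer means the address is listed, NXDOMAIN means it is not. A minimal sketch follows; the Spamhaus zone is shown as one well-known public example, not necessarily what Jenkinson's tooling uses.

```python
import socket

def dnsbl_query_name(ip: str, zone: str) -> str:
    """Build the reversed-octet name a DNS blacklist expects:
    203.0.113.7 checked against zone Z becomes 7.113.0.203.Z."""
    octets = ip.split(".")
    if len(octets) != 4 or not all(o.isdigit() for o in octets):
        raise ValueError(f"expected a dotted-quad IPv4 address, got {ip!r}")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """True if the blacklist resolves the name; NXDOMAIN means not listed.
    Requires network access and a resolver the list zone will answer."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False
```

At the scale Jenkinson describes, an OSINT pipeline would run these lookups across whole address ranges with an asynchronous resolver rather than blocking stdlib calls, but the query construction is identical.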

[53:25] Pamela Isom: As a last question, there's something I typically ask my guests to share. You've been giving words of wisdom throughout this whole conversation, but usually as we wrap up I'll ask: can you share any final words of wisdom or experiences for the listeners, so they can walk away and think about something that you said?

[53:46] You've given a lot here. I would say, for me, that I like that we've got to remind people to go back to the CMMC, whether they're trying to get level three or not.

[53:59] Go back, check out the requirements for level one, and then, as you mentioned, level two. That will help get some of that basic hygiene in place. But outside of my perspectives, give us some parting words and experience.

[54:15] Andy Jenkinson: You'll remember the famous politician Rumsfeld. And Rumsfeld is famed for a number of things, and I think he was a very contentious man at the best of times. But what he did say is, we have known knowns, we have known unknowns, and we have unknown unknowns.

[54:31] And what we're saying is, I don't care who you are, which organization, which bank, which government, whatever, if you don't know what your Internet assets are and what your servers are doing,

[54:44] you have no way of protecting them. You must have a full inventory of your Internet assets as connected to the Internet. Because if you're only protecting 90% of them, it won't be the 90% that gets exploited.

[54:59] It'll be the 10% that you don't know about.

[55:01] And that's it, in a nutshell.

[55:05] Let me leave you with this, though. In your personal life, sometimes it's best not to know everything.

[55:10] In your business life, you need to.
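Jenkinson's closing point, that the unprotected 10% is the part that gets exploited, reduces to a set comparison: diff what the organization believes it owns against what external observation can actually see. A toy sketch with hypothetical hostnames standing in for a real register and a real scan:

```python
def inventory_gap(declared: set[str], observed: set[str]) -> dict[str, set[str]]:
    """Compare the official asset register against externally observed hosts."""
    return {
        "unknown": observed - declared,  # live but unmanaged: the 10% nobody protects
        "stale":   declared - observed,  # registered but no longer answering
        "tracked": declared & observed,  # inventoried and visible, as expected
    }

# Hypothetical data: what the register says vs. what a scan actually found.
declared = {"www.example.com", "mail.example.com", "vpn.example.com"}
observed = {"www.example.com", "mail.example.com", "old-test.example.com"}
gap = inventory_gap(declared, observed)
```

In practice the observed set would be fed by certificate-transparency logs, passive DNS and active scanning, but whatever the source, anything landing in the "unknown" bucket is exactly the asset Jenkinson warns you have no way of protecting.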

[55:13] Pamela Isom: Yeah, that's good. That's good. That is good. All right. So, thank you so much.

[55:20] Andy Jenkinson: My pleasure.
