AI or Not
Welcome to "AI or Not," the podcast where digital transformation meets real-world wisdom, hosted by Pamela Isom. With over 25 years of guiding the top echelons of corporate, public and private sectors through the ever-evolving digital landscape, Pamela, CEO and Founder of IsAdvice & Consulting LLC, is your expert navigator in the exploration of artificial intelligence, innovation, cyber, data, and ethical decision-making. This show demystifies the complexities of AI, digital disruption, and emerging technologies, focusing on their impact on business strategies, governance, product innovations, and societal well-being. Whether you're a professional seeking to leverage AI for sustainable growth, a leader aiming to navigate the digital terrain ethically, or an innovator looking to make a meaningful impact, "AI or Not" offers a unique blend of insights, experiences, and discussions that illuminate the path forward in the digital age. Join us as we delve into the world where technology meets humanity, with Pamela Isom leading the conversation.
E016 - AI or Not - Danyetta Fleming Magana and Pamela Isom
Welcome to "AI or Not," the podcast where we explore the intersection of digital transformation and real-world wisdom, hosted by the accomplished Pamela Isom. With over 25 years of experience guiding leaders in corporate, public, and private sectors, Pamela, the CEO and Founder of IsAdvice & Consulting LLC, is a veteran in successfully navigating the complex realms of artificial intelligence, innovation, cyber issues, governance, data management, and ethical decision-making.
Join us for an eye-opening conversation with Danyetta Fleming Magana, the trailblazing founder and CEO of Covenant Security Solutions Incorporated. From her unexpected start as a civil engineer to becoming a global authority in cybersecurity, Danyetta's journey is nothing short of inspirational. She gives us a rare glimpse into the early days of information security, the mentors who propelled her forward, and the global expansion of her company. Danyetta also parallels the growth of cybersecurity with the evolution of artificial intelligence, offering insights into regional differences in data regulation and security.
In another captivating segment, we tackle the critical yet often ignored aspect of secure product development. In the rush to bring new software products to market, security is frequently an afterthought. We advocate for a holistic approach that contemplates potential vulnerabilities from diverse perspectives. With a focus on the risks of using internet-derived or AI-generated code, we draw connections to the historical issues of data oversharing and current concerns about AI and data security. This segment underscores the necessity of embedding ethical considerations and robust security measures into the development process for truly reliable software products.
Finally, we delve into the vital role of cybersecurity training and policy development. Highlighting foundational controls like the NIST standards and CMMC Level 1, we emphasize the importance of AI policy, ethics, and lifecycle management in modern training programs. We also discuss a collaborative initiative offering workshops on AI and cybersecurity to equip organizations with the knowledge to tackle emerging threats. The episode wraps up with a sobering look at the blurred lines between reality and digital threats, stressing the importance of personal connections and open communication to combat AI-driven malicious activities. This episode is a must-listen for anyone interested in the evolving landscape of cybersecurity.
[00:14] Pamela Isom: This podcast is for informational purposes only. Personal views and opinions expressed by our podcast guests are their own and are not legal, health, tax, or professional advice, nor official statements by their organizations. Guest views may not be those of the host. Hello, and welcome to AI or Not, the podcast where business leaders from around the globe share wisdom and insights that are needed now to address issues and guide success in your artificial intelligence and digital transformation journey. I am Pamela Isom, and I am your podcast host. Today we have a special guest with us, Danyetta Fleming Magana. Danyetta is founder and CEO of Covenant Security Solutions, Incorporated. Danyetta, welcome to AI or Not.
[01:24] Danyetta Fleming Magana: Thank you for having me. I'm so happy to be here.
[01:26] Pamela Isom: I'm delighted that you're here. I was a guest on a webinar that we conducted together; you sponsored that webinar. The topic pertained to cybersecurity, AI, and risk management, and it was a really good dialogue. I remember distinct discussions around humans and what roles humans should play in cybersecurity and AI, particularly the AI arena, in order to help make our solutions more secure. It was an interesting dialogue, so we may get into that a little bit here. But we have a great relationship, I'm very thankful for it, and I'm just honored to have you on this show. We're going to start out by asking you to tell me more about yourself, your career journey, how you got to where you are today, and then tell me what it's like to be an international CEO.
[02:32] Danyetta Fleming Magana: Oh, wow, that's a loaded question. So, well, we'll start back at the beginning, you know, kind of like, in the beginning, you know?
[02:41] Pamela Isom: Go there. Go there.
[02:43] Danyetta Fleming Magana: So I actually got into cybersecurity, I like to say, by serendipity, divine intervention. I came out of school as an engineer, but I actually came out as a civil engineer. And when I came to work for, at that time, the Department of Defense, they said, well, we actually need help in this thing called information security. I think at the time they were calling it accreditations, and they said, this is what you're going to do. And I said, oh, okay. Yeah, and assurance, right.
[03:14] Pamela Isom: It was called information assurance.
[03:16] Danyetta Fleming Magana: Information assurance. And so you probably can remember, back in the day you just got books. There wasn't, like, (ISC)² and all this training they have now; I'm a little jealous of the younger ones. They have an actual rule book and some guidance. They just gave me a box and were like, yeah, you figure this out and we'll help you along. And I was very blessed to have some really great mentors who came along and said, you know, she's picking this up, let's give her as much as we can. And once I figured out what information security was, I realized at that point that this was going to be around a long time, that we were entering a data age where everything is about information. There's nothing that we do or interact with that isn't shaped by data; our entire lives are shaped by the data that we receive, that we use, that we create. And so very early on, I realized, okay, this is the field I want to be in, because it's always changing. And then around 2003, I started Covenant Security Solutions. And, yeah, very early on.
[04:27] Pamela Isom: That was early. Yeah, you're a trailblazer.
[04:31] Danyetta Fleming Magana: Thank you. Thank you. I guess I didn't realize it at the time; sometimes you're young and you're crazy. And so I was a little young when I started, but I also understood patterns and trends, and I understood that cyber was going to be big one day. I remember people coming to me and saying, oh, it's always going to be a job that one of the large integrators or somebody else is going to do. And I said, no, people are going to want this. This is going to be something that we're going to be talking about for a long time. And 20 years later, we're across four different continents right now. So, yeah, we've been very blessed to be a part of the cybersecurity discussions from very early on. I won't say the beginning, because it started long before I got involved, but early on through now, and also to be a part of the discussions for artificial intelligence. And, you know, it's very similar; I think you and I have had that discussion about the two paths almost following the same track. So I have a perspective, coming from watching cyber grow up, and it's still trying to grow up, if you ask me.
[05:54] Pamela Isom: Yeah.
[05:55] Danyetta Fleming Magana: Looking at AI and realizing, like, I'm seeing some of the same stuff. The pitfalls. Yeah.
[06:01] Pamela Isom: And goodness. Same goodness, and some similar pitfalls. Yeah.
[06:07] Danyetta Fleming Magana: You know, so we've continued to expand and look at cybersecurity in different jurisdictions now, which is always interesting, because people have a different perspective on their data and their relationship to their information. And I think people are learning, and so we're learning as they're learning.
[06:26] Pamela Isom: Does it differ per continent?
[06:31] Danyetta Fleming Magana: It does. I would say the Global North is a lot more regulatory. You've got your GDPRs, your NIST, you know, a lot more from a regulation standpoint. However, I tend to find that people in the Global South tend to be more conscious about what you get. Like, we have sort of this running, I would say, underlying, I don't know if you'd call it a joke, but people say it all the time: what do you have to hide? Your privacy, what do you have to hide? Anytime you say to someone, hey, these are the regulations, this is how you have to protect my information or someone else's, there's always this whole, what do you have to hide? Whereas I find in other jurisdictions, people kind of already have a sense of, you know what? It's not so much about what I'm trying to hide. It's just, do you have the right to know that? Right. I'm not necessarily trying to hide something, but I'm just letting you know there are boundaries that I've created for myself. And so it's a little bit different. I think there's a little bit more understanding of why it's important.
[07:51] Pamela Isom: Okay, well, I didn't realize that difference from a continental, global perspective. But what I did notice is that some countries have been leaders in AI regulations, for instance. So they've been leaders in coming out with, well, the GDPR was really about data, but the AI regulations with the European Union, and Japan has started to do more as well; I've been talking to some colleagues from Japan. So I knew that those countries had taken a stand and started to do more and more around regulations, but I never made the connection between the privacy piece and "what are you trying to hide?" versus "do you have a need to know?" Right. So I think that goes to what the US has always said around zero trust. And I bet that really resonates with you when you're working with those countries, because it's not that you have anything to hide; it's that you're trying to use good common sense and make sure you're not just laying it all out there for no reason. Right? Otherwise it's like opening my door and saying, okay, I'm not going to sleep with the doors locked, because why should I? What do I have to hide? Really? Seriously?
[09:37] Danyetta Fleming Magana: Right. And those are the arguments you tend to get in, you know, Western countries. They're more like, well, what are you trying to hide? Why can't I know? And it's like, well, everyone doesn't need to know everything. And that's getting back to zero trust. A lot of the issues we're finding now is everyone saying we're going to go implement zero trust, but they don't really understand what that means. It means you have to define what is trusted.
[10:02] Pamela Isom: Yeah.
[10:03] Danyetta Fleming Magana: It's not a software program. And I've seen people say, I've got zero trust, and when you look at it, it's really like an enhanced version of discretionary access controls or mandatory access controls. I know Jane touched this piece of information, but they're not asking the question: should Jane be touching it at all?
[10:20] Pamela Isom: Oh, that's good. That's good. Yeah.
[10:23] Danyetta Fleming Magana: What is Jane's responsibility? What should Jane be able to do if she's touching the information? Should she be able to view it? Should she be able to manipulate it? How much should she be able to manipulate it? Those are all questions that, because our paradigm is different here, we just move data around, and then after the fact we're asking critical questions that should have been asked in advance, you know?
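To make that distinction concrete, here is a minimal sketch in Python of the kind of default-deny check Danyetta is describing, where trust is defined up front in an explicit policy rather than reconstructed after the fact from who touched what. All of the role and data-classification names are hypothetical, invented purely for illustration.

    from dataclasses import dataclass

    @dataclass
    class AccessRequest:
        user: str        # who is asking, e.g., Jane
        role: str        # hypothetical role label, e.g., "analyst" or "admin"
        action: str      # "view" or "edit"
        data_class: str  # hypothetical classification: "public", "internal", "restricted"

    # Trust is defined in advance: which roles may take which actions on which data.
    POLICY = {
        ("analyst", "view"): {"public", "internal"},
        ("analyst", "edit"): {"public"},
        ("admin", "view"): {"public", "internal", "restricted"},
        ("admin", "edit"): {"public", "internal"},
    }

    def is_allowed(req: AccessRequest) -> bool:
        """Default-deny: the request succeeds only if the policy explicitly names it."""
        return req.data_class in POLICY.get((req.role, req.action), set())

    # "Should Jane be touching it at all?" is asked before the fact, not after.
    print(is_allowed(AccessRequest("jane", "analyst", "edit", "restricted")))  # False
    print(is_allowed(AccessRequest("jane", "analyst", "view", "internal")))    # True

The detail that matters is the default: anything the policy does not explicitly permit is denied, which is the opposite of moving the data first and asking the critical questions afterward.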
[10:50] Pamela Isom: I see. I see. So then, considering what you just said, if I were to think about product development, there's always issues. Every day there's something surfacing along the lines of breaches, or not necessarily breaches, but things we should be considering. New tools are coming out that we want or are considering integrating; sometimes you have a choice, sometimes you don't. So I want to talk more about that. In the product development realm, the example I'll use is the concerns today around the tools, I won't say the vendor's name, that will take screenshots as you're working, around every three or so minutes, screenshots of what's happening on your computers, whether it's your laptop or your servers. I don't have all the details, but I know that that is a concern, and that is not the only vendor that has this type of capability; this capability has been integrated before now. So the question I have for you is, when we're looking at product development, what do we do? What should we be considering to better integrate cybersecurity so that we are protecting ourselves? And is that a good example of something we should be doing, the screenshots of what we're doing?
[12:26] Danyetta Fleming Magana: Well, I'll answer the last question first. Absolutely not. There isn't a need to take a screenshot of someone's machine on a regular interval, especially when there are other tools out there to save and capture data that aren't so intrusive. And I think for this particular product, the way it's doing it, it's not protecting the information at all. It's basically leaving it out there for anyone to pick up and retrieve. But back to the earlier part of your question. This is an issue, I think, that has been going on for at least the 30 years I know of, and probably even longer, when it comes to product development. Oftentimes when people sit down and do these software sprints, they're so focused on the functionality, on how do I get something out there that people are going to think is cool. Somebody comes up with an idea and says, okay, we're going to do a sprint, and in the next three to six months we're going to have something out to market. And a lot of those discussions are more or less financially driven: how fast can I get my new novel thing to the market? What's typically missing in those conversations is security. And when someone says they have security in that conversation, it's never from a use case perspective. And that's where it has to start. You can't just look at it from, okay, well, we've got two-factor authentication on it, we might encrypt the data, we might do these certain specific things. They make it very, very siloed, right? But they haven't taken the product and looked at it from a use case perspective to say, okay, how would a person interact with it? And how would a person feel when I'm sitting here watching my screen taking pictures nonstop, and then I find out the neighbor down the block has my passcodes, and I never gave them to them, but they figured out how to get to my machine's recall, you know?
[14:35] Pamela Isom: Right, exactly.
[14:37] Danyetta Fleming Magana: You know, that whole dynamic about what it means to securely architect, and to do so with speed that meets the market's needs, has never really been addressed. Because to this day, when you say you're doing software testing, a lot of it is someone coming in and doing pen testing. They're not necessarily coming in with use cases and saying, well, if I'm a creative twelve-year-old, what would I do? If I'm a really technical person, what could I do? And those things are necessary, especially when you're talking about the development of brand new software products, because any type of insecurity that's built in there just compounds. And especially in this day and age. We're from the times, and I don't want to date myself, but I remember when people actually coded all the way down at the machine level, right? Now people are coding by taking pieces of code off the Internet. They may or may not even understand what's actually being done. And now you introduce AI to that. You've got people out here saying, okay, I need a piece of code that does X, and it's showing them exactly what that looks like in Python, and they're popping it in. So when we talk about being able to create secure products, it's the process that's been broken, and I don't think it's ever really been formally addressed and fixed. And I think it's also just this driver of, I've got to be the latest to get something out, and the latest may not necessarily mean that it's secure. A lot of times, and I remember this conversation back when the social media companies were coming out, people were like, I don't think that's a good idea, kind of like what we say now about AI. But the prevailing thing was, oh, well, people won't care that they're sharing their information online. And let's be honest, for the most part, people have overshared everything.
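As a hedged illustration of how that copied-code risk compounds, consider a pattern that circulates widely on the Internet and in AI-generated suggestions: assembling a SQL query by formatting user input straight into the string. The table, column, and function names below are invented for the example; the point is only that a snippet can "work" on the happy path while quietly shipping a vulnerability.

    import sqlite3

    # A throwaway in-memory database so the example is self-contained.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
    conn.execute("INSERT INTO users VALUES ('jane', 's3cret')")

    def find_user_unsafe(username: str):
        # The commonly copied pattern: input formatted straight into the SQL.
        # Input like "x' OR '1'='1" makes the WHERE clause always true.
        return conn.execute(
            f"SELECT * FROM users WHERE name = '{username}'"
        ).fetchall()

    def find_user_safe(username: str):
        # Parameterized query: the driver treats the input as data, not as SQL.
        return conn.execute(
            "SELECT * FROM users WHERE name = ?", (username,)
        ).fetchall()

    print(find_user_unsafe("x' OR '1'='1"))  # leaks every row: SQL injection
    print(find_user_safe("x' OR '1'='1"))    # returns nothing, as intended

Both functions behave identically for ordinary input, which is exactly why testing only the happy path never catches the first one.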
[17:03] Pamela Isom: Yeah. Yeah.
[17:05] Danyetta Fleming Magana: And so now we're sitting here with all this data out there, and we're going back and saying, hey, look, that wasn't a good idea. We told you it wasn't a good idea. Let's not do this with AI again. Let's think about how we want to do this. And you're getting the same sort of responses: you know what, the public does it, they don't care.
[17:29] Pamela Isom: You are so, so accurate, in the sense that, number one, when the product development life cycle is underway, we are not revisiting the metrics. We have the wrong measures and we have the wrong metrics. We're looking at how fast we can get the product out the door because we're trying to compete, or as my mom used to say, keep up with the Joneses.
[18:01] Danyetta Fleming Magana: So we're trying to keep up with Mister and Missus Jones.
[18:04] Pamela Isom: Yeah, exactly. And it becomes really clear that we're not thinking of the implications. We're thinking about what I call the happy path. We're thinking and operating based on that and not really looking beyond it. And when a cybersecurity person or an ethics expert wants to raise something like that, it's hard for them to be heard. It's like, ah, yeah, right, well, that's not gonna happen, that's not how this works, get out of here. Right? And that's what we don't want. I don't know what the rationale is behind a lot of these things, but I do know that in this one particular example, and even in the past, it was about getting as much information out as possible, coming up with the novel idea and getting it out as fast as possible so that we could supposedly communicate. But we also know that there's always a strategy behind this stuff. Right? So in this case, what's the real strategy? Is the real strategy to get data to train the AI models? I think that's what is behind a lot of it: you have screenshots, and no one can say how this is really secure. So I agree with you, there's nothing appetizing about taking a snapshot of my screen every few seconds. It makes me wonder about storage. Is the price of storage, the price of computers, the price of cloud computing going to go up because this data has to be stored somewhere? And so it makes me concerned about that. So I agree with you that there is no rhyme or reason as to why. I just don't see the value proposition behind it at all. Either we didn't do a good job of explaining what the value proposition is, or we just missed the mark when it came to, okay, here's a great idea and here's an opportunity to allow people to get back information that they want to get back. It just doesn't make sense. I can't even make sense out of it.
[20:26] Danyetta Fleming Magana: Don't try. You're going to hurt yourself. But I think it also gets back to another piece that has never been there when it comes to cyber, and I think it's the same with AI: there's never been an organization to hold you accountable for what you produce. So, for example, if I go out and I want to sell peaches, the FDA or the USDA is going to send someone out to my farm to make sure that I have safe practices, that I'm not dumping bleach on my peaches and then trying to sell them to you at market. They're making sure that there's a process in place and that you're following that process; therefore they can ensure there's a safe product at the end. It's the same with consumer product safety, right? You can't just go out and sell a toy to a child that has sharp, pointy ends, because they know it's more likely for that four-year-old to fall on it and get injured than to actually use it as designed. Where is that for software?
[21:33] Pamela Isom: And it's supposed to be there. It's in the product development life cycle. The question is, do we need to evolve that methodology, evolve our approach? Because we've got technologies like AI that will magnify and amplify data faster than we could even describe. So maybe we need to be looking at how we evolve our whole methodology. I know Agile is still there and.
[22:00] Danyetta Fleming Magana: Agile is ever so important, but who's validating that Agile is being used? That's the validating piece; where's that? You know, people say, I hate government regulations and government entities, but there has to be an entity somewhere whose whole job is to look at what's in the best interest of public safety.
[22:21] Pamela Isom: I agree.
[22:21] Danyetta Fleming Magana: And when it comes to cybersecurity, when it comes to AI, and beyond, right? Because these are the things we're talking about today. Who knows what we're going to be talking about in five years?
[22:32] Pamela Isom: Quantum.
[22:33] Danyetta Fleming Magana: Quantum. And so who's sitting back and saying, you said this product can do A, B, C, and D? Who's testing to make sure that that software product actually meets those goals and that it's not being harmful to the consumers of that product?
[22:52] Pamela Isom: Yeah. Who's looking beyond the happy path?
[22:54] Danyetta Fleming Magana: Right.
[22:55] Pamela Isom: Right.
[22:56] Danyetta Fleming Magana: As a company, I can say, yeah, I looked at that.
[23:00] Pamela Isom: Yes.
[23:00] Danyetta Fleming Magana: I've got to say that I've spent, you know, two, three million, twenty, thirty million, some of them two, three hundred million, on a software development life cycle. At what point am I going to sit here and tell you it doesn't work?
[23:12] Pamela Isom: Yeah.
[23:13] Danyetta Fleming Magana: Where's my incentive to say I'm sorry? Yeah, what's the incentive?
[23:19] Pamela Isom: So if we look at the whole discussion around opting out, considering what we just discussed: some of these vendors are providing the option to opt out. Okay, so I'm just gonna say this. I still have to read the policies around cookies and accepting or opting out. I still have to slow down and read it, because it just pops up on my screen when you're in the middle of trying to do some things, and you're like, do I need this or not? And what does this really mean? I think there's more work being done to try to make those data use policies and data use agreements clearer.
[24:10] Danyetta Fleming Magana: I think it's difficult for a non-technical person to do it, and I think they recognize that. Right. Because at the end of the day, data is still king. Data is the new oil; I think that's been around for, like, the last ten years or so. So the whole thing is, just like you said, it pops up on the screen and you just want off the screen as quickly as possible so you can go about doing what you're doing. Meanwhile, you have allowed someone to continue to collect information on you that you may not want them to collect. And it's difficult, because unless you have someone who can sit down with you, or you're willing to spend time on YouTube watching a couple of videos on how to lock down your browsers, how to go through those policies and figure out what they're saying and whether or not you even want to be on that site, most people don't have that kind of time. I mean, you're lucky if somebody stays on your site for a minute and a half. That's called good conversion in 2024; if you can get them to sit on your website for a minute and a half, you've done excellent. So the reality is, to me this gets back to the conversation about accountability. How do we build accountability and trust into a system that is really, at this point, on an honor system? Right. And for individuals, when they say they opt out: some of the social media sites, you opt out and they start collecting more data. So your capacity to feel like you're being protected, that's still in the hands of very few companies and organizations, and you have very little transparency into that. I think GDPR and some of the privacy regulations are trying to open that door a little bit, to say that you can do a data subject access request.
[26:08] Pamela Isom: Yeah.
[26:08] Danyetta Fleming Magana: But even with that, what are you getting? Right. My background is more in security engineering; that's kind of where I was reared up in cyber. And I remember thinking to myself, right before GDPR and all the privacy regulations came out, wow, you ought to see these database companies go through the roof. Because you have to have a way to track what people are doing, and what other way is there to do that right now than through a database? But they didn't go through the roof. And the reason is that everyone was hiring their lawyers. They had no intention of keeping track of what you're doing on that platform, any more than what they need to be able to resell that data or feed that information to their future AI systems or whatever. They don't really have the capacity to sit down and say, okay, Jane Doe, this is all the information we have on you. So even if you filled one of those requests out, I would be hard pressed to believe they could even tell you everything they have on you in their systems.
[27:10] Pamela Isom: Uh huh.
[27:10] Danyetta Fleming Magana: The infrastructure is not there to make it happen.
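For readers who want to see what "the infrastructure is not there" would mean in engineering terms: answering a data subject access request requires a per-subject inventory that is populated at the moment data is collected. A minimal sketch follows; all system names and categories are hypothetical, and a real platform would need this mapping maintained across every store that holds personal data, which is precisely what most were never built to do.

    from collections import defaultdict

    # Hypothetical inventory: data subject -> list of (system, data category) records.
    inventory = defaultdict(list)

    def record_collection(subject_id: str, system: str, category: str) -> None:
        """Log, at collection time, where a subject's data lives and what kind it is."""
        inventory[subject_id].append((system, category))

    def answer_dsar(subject_id: str):
        """Answer 'what do you have on me?', possible only if collection was logged."""
        return inventory.get(subject_id, [])

    record_collection("jane_doe", "crm_db", "contact details")
    record_collection("jane_doe", "analytics_store", "browsing history")
    print(answer_dsar("jane_doe"))
    # [('crm_db', 'contact details'), ('analytics_store', 'browsing history')]

The prerequisite is the point, not the code: unless collection is tracked when it happens, no after-the-fact query can truthfully enumerate everything a company holds on a person.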
[27:13] Pamela Isom: The infrastructure is not there. And the asset inventory, understanding what our assets are, is not there. You're saying the infrastructure is not there; the ability to capture what we really have, and to go back and reflect on what we really have, isn't there.
[27:33] Danyetta Fleming Magana: It's not there.
[27:34] Pamela Isom: So, opt out. When I asked the question about opting out, because I think that was an interesting discussion, I heard you say that basically there shouldn't be a need to have to opt out. If you build it safely to begin with, why would you have to opt out? Right. So I heard you say that, which I agree with. And I also heard you say that from a non-technical perspective, it's just complicated. So we have to work with people to help them understand. And that leads me into the cybersecurity training, because I'm thinking about how we are providing training, and working on providing training, to help people. Me specifically, I'm focused on policies: what should the policies look like, and what should you change about policies to support the era in which we live? And I will always say, and I continue to say, that it goes back to some of the fundamentals we've already learned around the product development life cycle, the project development life cycle, and the NIST controls. Like, do you understand the NIST controls? Even CMMC Level 1, right, is good, because those have the fundamental controls that set the stage for everything else. So everybody should be concerned with making sure they're at CMMC Level 1, whether you're going to do business with the government or not. Right? So some of those basic controls, and I heard you say that. Now, training: I provide training around AI policy, AI development policy, ethics, and making sure that we're looking at the life cycle, as you mentioned earlier. But we've been talking about a collaboration where we do more around cybersecurity training, blending AI and cybersecurity. Can you talk some more about that?
[29:39] Danyetta Fleming Magana: Absolutely. So what we're looking at is actually having a series of technology workshops that are going to cover different areas. We're starting with four areas at first. The first, as you just shared, is AI and cybersecurity. The second is looking at ransomware, but from a risk governance standpoint: what are the steps you can take to make sure you survive a ransomware attack? We're also looking at threat intelligence, and what that means. How do you create and understand what threat means to your organization? Because it's very unique. I think people like to go out and buy these feeds and say, now I know all the threats. You know a lot of stuff, but what does that mean to your organization? Right?
[30:25] Pamela Isom: What does it mean?
[30:26] Danyetta Fleming Magana: What does it mean? Information without context is useless. It just is. And last but not least, we're looking at cyber sustainability. So when we look at cybersecurity, we're not just talking about how we come in and lock databases down or things like that, but how do we actually make this a part of the goals and the needs of the community around us? So those are the four areas we're starting with, and I'm definitely going to encourage people to come out, support, and take the time to learn. I think sometimes we get complacent, and that's true of all of us, myself included. We figure we know what's there, you just change your password, and it's bigger than that. And especially as we start to move from '24 into '25, you're looking at all of these conflicts that are sparking up around the world. Cybersecurity is a lot bigger than changing passwords. There's a lot that goes into it. And part of the reason it's hard is because it's more than just talking about a hacker. I feel like that narrative has gone on for so long that people think, oh, okay, you can't do anything, the hacker just comes in, when in fact there's a lot that you can do. And so part of the technology workshops is to help organizations, whether they're nonprofits, government organizations, Fortune 500s, or small businesses. Our whole goal is to help get information out to you and to have a dialogue, so you can help protect yourself and help protect our global village, as I call it.
[32:06] Pamela Isom: I agree with all of it, which is why I'm a part of it. So I want to thank you for including me in that effort. I'm looking forward to the outcomes, and I'm looking forward to advancing some capabilities across the globe. Right, so you're the international person, so I feel like that gives us an opportunity to have more impact around the world. That's a very beautiful thing. So I know we've had a really good conversation around opting in and opting out, the technologies that are emerging, the product life cycle. We've just had a good discussion here. But I'd like to know: is there anything else you would like to discuss before you impart your experiences and your words of wisdom?
[32:51] Danyetta Fleming Magana: I think we covered the gamut.
[32:55] Pamela Isom: We did. It's been a good discussion. Okay, so then, go ahead. Do you have any parting words for us, myself included?
[33:04] Danyetta Fleming Magana: I think in these times, where we're going to continue to see cyber threats, and the combination of cyber and AI threats, expand, everybody has to take time to connect with each other, person to person. Technology is meant to enhance our personal relationships and to expand them, not to replace them. And it's going to be so important, especially for families. It's heartbreaking when you hear about people cloning a daughter's voice with AI and then claiming they're holding the child for ransom, or you hear about a business, I think we talked about that on our last webinar, where the gentleman was on a completely AI-cloned call. It has impacts on all of us. So I think it's just important that we connect. There's nothing wrong with the phone. Pick up the phone and call people. Take the time to connect, because we're entering a time where the real and the fake are going to blend. And if you don't know the people you're around, you're going to be in trouble.
[34:18] Pamela Isom: That's good insight, especially the fact that you're reminding us that the real and the fake are going to blend, so we have to know how to rightly divide. So in the policies, in the training and the coaching, even in our personal lives, like you said. And you've heard me say this before, but do our kids know how to tell whether something is real or not? Have we sat down and had those discussions to help them, to say, hey, if you get this phone call, this is how you know it's me, right?
[34:57] Danyetta Fleming Magana: Do you have a code word? Have we talked about this? This is important for people, especially in these times as these conflicts continue to grow. No one is immune. You can't say, well, I just go to work, I'm just a software engineer at Google, who cares? Yeah, somebody cares.
[35:15] Pamela Isom: Somebody cares.
[35:16] Danyetta Fleming Magana: Somebody cares. They care about their cause, and they'll come for you just like they'll come for a senior leader in state or national government. You know, at this point, we're in a whole new paradigm, so it's important we connect and don't forget about each other.
[35:32] Pamela Isom: That was so insightful, and I am so glad that we had an opportunity to talk today. And I'm so thankful to have you join me on AI or Not. It's just so nice to have you here.