AI or Not

E028 - AI or Not - Dr. Lisa Kinnard and Pamela Isom

Pamela Isom Season 1 Episode 28

Welcome to "AI or Not," the podcast where we explore the intersection of digital transformation and real-world wisdom, hosted by the accomplished Pamela Isom. With over 25 years of experience guiding leaders in corporate, public, and private sectors, Pamela, the CEO and Founder of IsAdvice & Consulting LLC, is a veteran in successfully navigating the complex realms of artificial intelligence, innovation, cyber issues, governance, data management, and ethical decision-making.

In this compelling episode of AI or Not, host Pamela Isom welcomes Dr. Lisa Kinnard, a seasoned engineer at the FDA, to discuss her journey in electrical engineering, AI applications in radiology, and the critical gaps in STEM education and diversity. Dr. Kinnard recounts her academic path, beginning at Howard University, and her pioneering work in medical imaging and AI during the early 1990s. She highlights the alarming disparities in breast cancer detection, particularly triple-negative breast cancer, which disproportionately affects young Black women. The conversation also delves into AI’s transformative role in radiology, from aiding in cancer detection to improving outcomes in rural hospitals. Dr. Kinnard emphasizes the urgent need for diversity in clinical trials and STEM fields, citing the underrepresentation of African Americans in research and education. Together, they explore actionable solutions, from early education initiatives to increasing awareness of STEM opportunities. This episode is a must-listen for those passionate about AI, healthcare, and equitable innovation.



[00:00] Pamela Isom: This podcast is for informational purposes only. Personal views and opinions expressed by our podcast guests are their own and are not legal, health, tax, or professional advice, nor official statements by their organizations.

[00:40] Guest views may not be those of the host.

[00:48] Hello and welcome to AI or Not, the podcast where business leaders from around the globe share wisdom and insights that are needed now to address issues and guide success in your artificial intelligence and your digital transformation journey.

[01:05] I am Pamela Isom, and I am your podcast host.

[01:09] I am excited to introduce our special guest today, Dr. Lisa Kinnard. She's an engineer at the FDA, where she reviews AI devices.

[01:20] She's a former research fellow at FDA and NIH, and we also have some USPTO experience in common.

[01:30] Lisa, thank you for being a guest on this podcast, and welcome to AI or Not.

[01:36] Dr. Lisa Kinnard: Thank you for having me.

[01:38] Pamela Isom: So, to start out with, will you tell me more about yourself, your career journey, your travels into research, which is fascinating, and how you ended up involved with being a research fellow at these critical agencies?

[01:57] Dr. Lisa Kinnard: Okay, well, I studied electrical engineering, all three of my degrees. For my first one, I was at Howard University, and I was, you know, studying the typical stuff, power and circuits and machines, not machine learning, but machines, and none of them interested me.

[02:16] I got to my fourth year, and I was just struggling with what I was going to do. I met a graduate student whose name is Dr. Frank DaCosta, and at that time he was starting to study medical imaging and radiology.

[02:33] And so it was fascinating to me to get into imaging, to combine the engineering with medicine. You know, that was fascinating to me.

[02:44] And so I started working with him. He designed my senior project. From there, I went to the University of Iowa and continued those studies in radiology and imaging.

[02:56] And that point is when I learned about AI. So this was 1994, 1993. I went there, worked on my master's, and started working with a guy in the hospital.

[03:09] And so what was happening was that they were looking at these images of eyes; they were histology images. And what they were finding with these certain kinds of tumors is that the tumors would leave the eye and go to the liver.

[03:24] They were metastasizing to the liver. And a lot of times when people get liver cancer, it kills them.

[03:29] But what they saw was patterns in these images. They were seeing that the cells were forming certain shapes. And so that was my first kind of segue into AI.

[03:43] And I thought it was just spectacular. So I worked with that ocular melanoma project. That was my master's thesis. From there, made my way back to Howard to work on a doctorate.

[03:56] And I started working with a man, Xi Cheng Lo, who was at Georgetown University.

[04:02] And he was working in AI, or Computer Aided Detection and diagnosis, as we used to call it. So he was working on that, and a lot of it was in mammography and, you know, breast cancer and things like that.

[04:18] He worked on that quite a bit. So at that point, I met yet another mentor at Howard. She's a radiologist, still a radiologist, who specialized in women's imaging. And from her, I learned about the disparities in breast cancer.

[04:35] So I'm not sure if people have heard of this, but there's triple negative breast cancer, which affects black women at a rate of about 50%. It is a very, very aggressive cancer.

[04:45] It has killed at least two friends of mine, and there's not a lot of work on it. But once I learned those issues about mammography in the Black community, and how much higher the disparity rate is, you know,

[04:57] we die of breast cancer sometimes 40 times over compared to other groups, then I was hooked at that point. And I said, oh, okay, so I can use this CAD work, you know, writing programs to find tumors, and help this group.

[05:14] Because I'm like, this is a very serious problem. But the triple negative breast cancer, what is also very dangerous about it is that it's not just black women. It's young black women.

[05:23] So my cousin who passed away, she was diagnosed when she was 40, and she died that same year. And so I've learned of others who have this particular type of cancer.

[05:34] And so that's when I decided that this was what I wanted to work on in my life, you know, to help with the technical part, but to also really put a call out there as to this breast cancer disparity and how dangerous it is to us.

[05:52] Pamela Isom: Well, first of all, I smile because I appreciate the fact that you wanted to do something about it. I appreciate that. Because it's hard, right? It's hard

[06:07] when we see this going on and we're trying to do something. I mean, we just want to do something about it, period. So I'm glad that you were able to make that connection, started to focus in on what we can do, and started to use the tools to address this problem.

[06:26] We need more people like you. So that's why you saw me smile there.

[06:34] Let's go deeper into that. What is your perspective on AI when it comes to radiology and health? How does AI fit in the equation? I remember when CAD first came out, because I'm one of those that learned it a while ago, too.

[06:48] Learned AI a while ago, and CAD came out, and no one called it AI then. So tell us, tell me more about that connection between AI, radiology and health and clinical trials.

[07:00] Dr. Lisa Kinnard: So what it does. It is not meant to replace the doctor. And this is where the problems come in, because a lot of people in the radiology community believe that they are being replaced. While that might be advantageous to hospitals, yes, they would save a lot of money, but there would be a lot of problems with that.

[07:23] So what AI is meant to do is to work concurrently with doctors. You know, it is not to replace anyone, but it is to assist them with finding cancers, with finding tumors.

[07:35] And so the way it helps in radiology is that the eye is limited to certain colors. When you look at a radiological image, you'll see that it's shades of gray.

[07:46] It's many shades of gray. There are about 4,000. Our eye can only differentiate maybe 100 shades of gray. And so what that means is that there are things that we will miss, things that we can't see as well.
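That gap between what the data holds and what the eye resolves can be made concrete with a toy sketch. NumPy is assumed; the 12-bit depth (4,096 levels, matching the "about 4,000" figure) and the ~100-shade figure are illustrative, not clinical specifications:

```python
import numpy as np

# A radiological image is often stored at 12 bits per pixel:
# up to 4096 distinct gray levels in the data itself.
rng = np.random.default_rng(0)
raw = rng.integers(0, 4096, size=(512, 512), dtype=np.uint16)

# A display plus the human eye resolves far fewer shades; collapsing
# to roughly 100 perceivable levels merges many distinct values.
perceived = (raw // 40).astype(np.uint8)  # 4096 / 40 ~ 100 shades

print(np.unique(raw).size)        # thousands of distinct levels in the data
print(np.unique(perceived).size)  # only ~100 levels survive for the eye

# An algorithm operates on the full 12-bit values, so two tissues one
# gray level apart stay distinguishable to it even when, on screen,
# they collapse into the same shade.
```

Contrast windowing in real viewers mitigates this by stretching a narrow band of raw values across the display range, but the asymmetry stands: the computer always sees the full bit depth.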

[07:59] Now that we're in the 3D imaging realm, things are a lot better. But back in the day, it was just 2D. And so if something was hiding behind something else, you certainly couldn't see it.

[08:08] So the 3D is helping. But we have found, and we have studies that show, that the AI can help the sensitivity, so it can help find the cancers.

[08:21] Because going back a while, it's been maybe 15, 20 years ago, mammography alone, the sensitivity, its ability to help you find a cancer, was only 70%. And so if you lose 30%, that's a lot of people.

[08:36] With breast cancer being responsible for the second highest mortality of all cancers, lung cancer is number one, breast cancer number two, 70% was decent, but it certainly was not high enough.

[08:48] And so it helps with things like that. It also helps with generative AI, with things like report writing. I don't know if you've ever sat in a hospital and watched a radiologist, but they would read into a Dictaphone, which is like a little microphone, and

[09:04] those things would be typed up. And so you can make mistakes like that. And so generative AI can help with the reporting and that kind of thing. But AI can certainly help because, you know, our eyes are just limited, period.

[09:18] You know, most of the AI right now. I just watched a colleague of mine at work; he was doing a presentation on how much more AI is being used in radiology.

[09:29] It's up several times over. And so there's just a lot in the field. And so it can just help. Another way it can help, too, is with things that have time limits on them.

[09:42] So one of the things that we use is a field called triage. What triaging does is that it takes the images for things that you need to act on quickly.

[09:54] So things like strokes; that's what we use triage for a lot. With a stroke, a brain aneurysm, that is something that you have to act on very quickly.

[10:02] A person can die very quickly. So what a triage device does is that it takes the set of images and it reorders them in order of importance.

[10:12] So it will determine: oh, I see something, I want the doctors to read that image first, to get that person to care as quickly as possible.

[10:21] So it's helpful for quite a few things. It really is.
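The triage behavior described here amounts to reordering a reading worklist by an AI-computed suspicion score, so likely-urgent studies get read first. A minimal sketch, with made-up study IDs and scores rather than any real device's interface:

```python
# Each incoming study carries a hypothetical AI suspicion score (0.0-1.0).
worklist = [
    {"study_id": "CT-1041", "suspicion": 0.12},
    {"study_id": "CT-1042", "suspicion": 0.97},  # possible bleed: read first
    {"study_id": "CT-1043", "suspicion": 0.55},
]

# Triage does not diagnose; it only reorders the queue so the
# radiologist reaches the likely-urgent case as quickly as possible.
triaged = sorted(worklist, key=lambda s: s["suspicion"], reverse=True)

print([s["study_id"] for s in triaged])
# → ['CT-1042', 'CT-1043', 'CT-1041']
```

The human reader still makes every call; the sort only changes the order in which the calls get made.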

[10:25] Pamela Isom: So you're saying that, for instance, AI would look for anomalies in the data. So if it's a radiology chart, I can envision one of those charts that they're looking at.

[10:36] They've run it through the scanner and they're showing it and they're looking at it. Sometimes they look at it on the computer screen. Sometimes they bring out the chart and they just have it up on the wall.

[10:47] And I never know what it looks like. Right. I mean, I don't know what it means. It has an image of stuff, and so I can see it, but I don't know what it means.

[10:59] You're saying that the AI would help to detect, because I know breast tissue can be dense, help to detect the tumors and things that may be hidden behind the density.

[11:11] Dr. Lisa Kinnard: Correct, correct. Like breast density. That's another issue in the Black community; a lot of us have dense breast tissue, at a lot higher rates. And so you're talking about looking at white tissue on top of white tissue.

[11:22] And so what it is is that a computer can differentiate those very tiny shades of white that we can't see; our eyes are just limited. And so it may trace something out

[11:34] and you didn't see it. But if we're talking about triple negative breast cancer, that's something that needs to be acted on quickly. As I mentioned, I know someone who died within a year of that.

[11:44] And so it needs to be acted on quickly.

[11:48] Pamela Isom: What's the connection to clinical trials that we talked about? What does that have to do with this?

[11:54] Dr. Lisa Kinnard: So clinical trials help the researchers do their work. What that means is that they need patients to, you know, sometimes try drugs, sometimes participate in studies; there are a lot of mental health studies out there.

[12:08] And so in order to get approval through our agency, the FDA, you know, we approve all of the drugs, and we approve the devices, and we approve biologics.

[12:19] Those are our three centers. And so in order to get those approvals, we have to be convinced that it is safe for the public. That's what the whole FDA mantra is about: safety and effectiveness.

[12:34] That is the most important thing. So we are not going to approve something if it has been tested on, like, 10 people. You need to have enough people to test it on so that we can release it to the public, because we don't want people to die because there weren't enough people in the trial.

[12:50] So as far as African Americans in clinical trials, we comprise about 3% of people who participate. 3% is certainly not enough, because, as I said, there are diseases that affect us differently.

[13:07] And so we need enough people to participate in those trials so that we can determine that it's safe and effective not only for whites, but for Blacks and for Latinos and for Asians and for others. And so there need to be enough people participating.

[13:23] And what's happened, as you know quite well, is that in the past we have had the Tuskegee experiment and others that have harmed Black people knowingly. They knew that they were doing harm, but they didn't care; they just wanted to use us as guinea pigs,

[13:38] and test us and not inform us of the harms. And so people still carry that history with them, even though things have been made a lot safer. What people could do in the past, they just cannot do anymore.

[13:50] I think there's still a lot of fear. So we've got to get out there, us as scientists, and really inform people about the safety of those trials now and really get people to participate.

[14:06] Because if we can't participate, then they're going to have this one drug or this one device and it hasn't been tested on enough black people. And there could be a difference.

[14:15] I mean, we have differences; things like our body mass index are different in a lot of cases. I mean, we are needed in these studies. And 3% is just not enough.

[14:25] Pamela Isom: So what about simulated data? One of the benefits of AI is that it can generate simulated data. How helpful is that? Is there enough data for us to use the AI tools to simulate data?

[14:41] I mean, honestly, I don't want a tool simulating me. But you know what I'm getting at, right?

[14:46] Dr. Lisa Kinnard: The group that I used to be in, so, as you mentioned earlier, I was a researcher at FDA, and then I came back working in another capacity. So now I'm on the side that approves the devices, but I still work very closely with that group, the researchers.

[15:01] And so that's one of the things that we talk about. You know, we have AI meetings; we have these very, kind of, nerdy, sciencey meetings with them.

[15:10] And so some of them are working on those simulated images.

[15:13] Pamela Isom: But if you don't have enough data to begin with, that would be worrisome. So you've got 3%, which is not enough to say the clinical study is proven.

[15:26] Do we have enough data to generate simulated data?

[15:33] Dr. Lisa Kinnard: That's a very good question. That is something that I would have to talk to the people, to the researchers who are working on that more. I honestly can't answer that question.

[15:42] But that is a very good question.

[15:44] Pamela Isom: Because I would think that would be a concern. And so they probably have enough in certain areas.

[15:49] Dr. Lisa Kinnard: They may. And I'm not even sure, even with the simulated images that they're working on, I don't even know if it has gotten to the point where they're breaking things out into races yet.

[16:00] I don't think it's gotten that far, but they're certainly working on it, for sure. There are breast tissue phantoms, we call them phantoms, you know.

[16:08] So there are things that people are working on. But as far as race, I'd have to talk to some of my friends in that group and see what's happening with that.

[16:19] Pamela Isom: So ideally what we want to do is increase the success rates of the trials.

[16:24] Dr. Lisa Kinnard: Absolutely.

[16:25] Pamela Isom: So we need more representative data that the AI itself will consume and draw inferences against. So we're not able to use the tool as effectively as we could, because we don't have the baseline.

[16:40] We don't have the baseline data; we don't have as much of the data as we would like.

[16:47] Dr. Lisa Kinnard: We don't have enough people participating. I mean, we scrutinize very closely something that we call subgroup analysis, where we're looking at: what happens with this AI if it's on a different imaging machine, what happens when it's on a different group, a different age, a different gender.

[17:04] So with subgroup analysis, we test all of those things. And so we, as lead reviewers, if they don't have enough, we tell them: nope, you don't have enough here.

[17:14] And so they're not, they're not going to get through us until they have enough.
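The subgroup analysis described here, checking the same device separately per scanner, age band, gender, and so on, can be sketched as a small loop. The counts, group names, and thresholds below are invented for illustration; they are not FDA criteria:

```python
# Hypothetical per-subgroup results for an AI detection device:
# true positives, false negatives, and total cases per subgroup.
results = {
    "scanner_A":    {"tp": 88, "fn": 12, "n": 400},
    "scanner_B":    {"tp": 70, "fn": 30, "n": 350},
    "age_under_50": {"tp": 6,  "fn": 2,  "n": 8},   # far too few cases
}

MIN_CASES = 100        # illustrative threshold, not a regulatory number
MIN_SENSITIVITY = 0.80

statuses = {}
for group, r in results.items():
    sensitivity = r["tp"] / (r["tp"] + r["fn"])  # fraction of cancers found
    if r["n"] < MIN_CASES:
        statuses[group] = "insufficient cases"
    elif sensitivity < MIN_SENSITIVITY:
        statuses[group] = "sensitivity below target"
    else:
        statuses[group] = "pass"

print(statuses)
# → {'scanner_A': 'pass', 'scanner_B': 'sensitivity below target',
#    'age_under_50': 'insufficient cases'}
```

This mirrors the review posture described: a subgroup with too few cases can't be declared safe and effective for that group, which is exactly the 3% participation problem raised earlier.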

[17:17] Pamela Isom: Yeah, that goes back to some of the governance practices that we talk about from an AI perspective at large. A few things that you pointed out in this discussion: you were not talking about AI, you were talking about this situation,

[17:34] right, and the gaps that you've seen. But the whole time I kept thinking about AI governance and how you want to be sure that you have good representation. You want to be sure that the outputs are accurate and reflective of the different perspectives and diversity in society.

[17:57] And when you're governing AI, you also want to be sure that there are no biases. As you were talking, you said all of that, but you were talking about this situation here, so that's good.

[18:11] And then you talked about that when we went back to the CAD days.

[18:16] You have these tools; the tools need to be accessible. We need to be able to use them, and you need to be able to understand how to use them and what the outputs are and can do for us.

[18:27] But we don't need to be afraid that it's taking our jobs; we're dealing with this today. And from a governance perspective, we're trying to help people understand that AI is not going to take your job; it's there to support you and assist, and that humans must always be in the loop.

[18:44] You said that a few times. And those are governance fundamentals that we have to instill. We're saying that the governance needs to evolve to support the emergence of technology.

That's true. The governance needs to evolve. But some of these things have been there all along.

[19:03] Dr. Lisa Kinnard: I also would like to add that when I started off in medical imaging in 1992, what was happening at the time was that hospitals were entering the digitization age.

[19:17] So it was going from paper to digital. And so it was PACS: Picture Archiving and Communication Systems.

[19:23] So PACS is normal now. You go into a hospital and everything is digital. But it was not like that 30 years ago. And so one of the other ways AI can be helpful is rural hospitals,

[19:35] right? You have rural hospitals that are not like Johns Hopkins, where they just have everything. They're small; they may have one radiologist there. And so this is another way that these tools can help, because you could have somebody somewhere else, somebody at Hopkins, helping read these things, and the AI helping as well.

[19:53] So small hospitals, hospitals in communities of color that don't have enough resources. This is another issue, because, even in D.C. alone, the hospital in Southeast has closed.

[20:07] So all those people come to Howard. And you have Providence Hospital, which is closed. So now all of those people are coming to Howard.

[20:14] And so there are not enough doctors to manage all of that. So that's another way that it can be helpful, as these hospitals are closing and as they don't have the resources that they need.

[20:24] Pamela Isom: Are we getting the training that we need to use these types of tools? Are hospitals getting it?

[20:31] Dr. Lisa Kinnard: I mean, there is training that comes with it. But, you know, again, we as an agency have to make sure that the manuals make sense. The manuals are very important, because they have to talk about not only how to use the device, but how it's been tested, who it has been tested on,

[20:47] and what the results were. All of that is important. We're real sticklers about that. So, yeah.

[20:52] Pamela Isom: Okay, so let me shift gears a little bit. We talked also about math and science. And I know some mathematicians in communities of color; I know some professors, things like that. But I do think, and I have a question for you,

[21:09] I think there's a gap in math and science, but I'd like to get your perspective: is there a gap in math and science studies, in careers, in communities of color?

[21:19] And if so, what do you think are some things that we could do to address it, if that's the case, and why is it the case? What are your thoughts?

[21:28] Dr. Lisa Kinnard: There is, absolutely. Unfortunately, still. I just realized the other day that I have now had my PhD for 21 years. And I didn't realize that it had been so long, but I was like, oh my goodness, 21 years.

[21:45] But the numbers don't change, and the numbers haven't changed. And so I think that some of that is that Americans are afraid of math. I mean, if you look at the companies and such, there are lots of people from Asian countries, lots of people from India and Pakistan, lots of people from African countries, lots of people from the Caribbean. But African Americans, and just Americans, period,

[22:11] that is a small group. And that has been my whole career working in this; this is how it's been. And so I think that in this country, period, math is something to be feared.

[22:24] People are afraid of it. The numbers in the African American community are even smaller, because a lot of us are not in those private schools that have very rigorous kinds of work.

[22:37] A lot of us, too, get pushed into, you know, when I was a kid, they used to call them the skills classes. So a kid may just be a hyperactive kid,

[22:46] but if you're Black, they're going to push you into the skills classes. So they get further and further behind, starting from elementary school. I was looking at some statistics in D.C. the other day: something like 60% of Black kids are not on grade level for reading and math.

[23:06] I mean, in certain wards around D.C., it's absolutely ridiculous. They are not on grade level in the third grade, okay? So if it starts from there, then there's no way that we can get to undergrad in engineering and math and all of the things that you named.

[23:23] And so I think that's where the gap begins. It begins in elementary school. And then I think there are also other things; there are other brighter, shinier things.

[23:36] Sports.

[23:37] Everybody thinks that they're going to be Michael Jordan, or they're going to be this big football player. And so there are a lot of people pushing kids into sports, because I think to them that's like instant money.

[23:50] And so if we don't get them on grade level to begin with, this is what's going to happen. We're going to lose them to sports, we're going to lose them to other things.

[23:58] We're going to lose them to entertainment, singing, rapping, all of these things. And it's not that these things are bad, but we can't all be basketball players. But I think we have got to work with them from the beginning so that they can actually do the work.

[24:14] Pamela Isom: You mentioned fear. I know we want to start early, and I can appreciate that. And we'll talk later about some things that we can do ourselves to help with that. But we had talked about fear.

[24:28] What kind of fear? What do you think we're afraid of when it comes to math? Is it just that complicated? Is it the numbers? What's your take on that?

[24:37] Dr. Lisa Kinnard: Well, I think for a lot of people, it's probably the black and white nature of things. It's either right or wrong.

[24:43] And I think that people want to do things that are more nuanced, where they can have shades of gray. I think that might be one thing.

[24:52] And then for another thing, I was talking to my nephew recently who used to love math.

[24:58] He used to be very, very good at math. He was so good at math. He was kind of exceptional when he was younger. And so he's now 16.

[25:07] And I asked him, I said, you used to love math. What happened? He said: yeah, until it got hard. And I think that's not specific to him.

[25:15] I think that's a lot of kids; when things get hard, they start to not like them. So I think it's the challenge.

[25:23] I was different in that.

[25:25] I think all my life I've always loved things that were challenging. I've always run in that direction.

[25:32] And so if I got a B in a class and I could have gotten an A in another class, if I could be in that honors class, I was going to be in that honors class because I just wanted the challenge.

[25:42] And so I think that people just run from that. I think people run from challenges. I've seen it happen in engineering, where people start off, and then at a certain point they say: this

[25:53] is interfering with my social life too much. I don't want to have to carry these heavy books, this heavy physics book and this calculus book, up the hill every day with me.

[26:01] And then I think there's also the other issue, that there are certain fields, like law, that are more attractive, because if you simply go to the right law school or you go to one of those top firms, you can make quite a bit of money right off the bat.

[26:15] And I think that engineering is very hard work, but you may not be compensated as well financially as you would be in other fields.

[26:23] Pamela Isom: And you mentioned math and science. I agree with you. I feel like we need more opportunities. So we want to catch them in elementary, right? Even pre-K, get them interested in it.

[26:37] And I find that our little bitty ones, they love computers, right? They love computers. They're like born working with tablets, you know what I mean? So they love that. And so we have to look at how to get them to start applying these concepts early, naturally.

[26:58] And then I think we just need the opportunity. Sometimes we just don't have the opportunity. Let's say it gets challenging. We need good teachers.

[27:07] Dr. Lisa Kinnard: Correct.

[27:08] Pamela Isom: Good guidance. And we need those to help us to be persistent, to persist.

[27:15] I think that there is a gap and I think there is an opportunity that we can address and that we're going to have to come together to address.

[27:23] Dr. Lisa Kinnard: I'm a member of the Links, and we work with the Services to Youth group, and we have an NSBE Junior, you know, National Society of Black Engineers, junior chapter at a school in Southeast.

[27:35] And so we worked with a set of boys last year; we're going to be working with those same boys this year. And so when I read off those statistics about the Black kids in D.C.

[27:47] and the disparity, there was a collective gasp in the room. So there are a lot of educated adults that don't know about these disparities.

[27:57] I mean, that's something. If we get the word out there, then they're going to understand, because I think we say a lot: oh, we need more people. But when you give them actual numbers and show them a chart of the difference, yeah, they're in shock.

[28:10] Pamela Isom: I agree. And I think we need to show them those that are in the fields and doing well. This week I was at a meeting, it was communities of color, African American men, women, other nationalities together, and we were discussing GIS and AI, but mostly they were discussing GIS and how we can use tools like that, which is science, right?

[28:39] We were over at your alma mater, at Howard, and we were in a discussion about this and how we want to see more communities taking advantage of these types of tools, because that is where technology is headed.

[28:54] You need maps, you need geological data; you even need it for cyber defense. You need to understand geolocation and all that kind of data. We were talking about how technology is emerging, and they made it a point to have some of the younger people talking about what they're doing and how they're using the tools.

[29:17] And it was just fascinating to hear them express their excitement about these tools and how it's helping them understand what's going on in their communities based on the data, and how they were able to pull the data together into maps and into different types of images.

[29:34] So that's science. So that's math and that's science. Yeah. So we have our, as one strategy as a, I'm an engineer like you, so I'm like always trying to figure out what we can do to solve some of these problems.

[29:46] And so if we could just take our itty bitties like the pre K and the kindergartners and the second third graders and let them see that young people using some of these tools and using AI and demonstrating how.

[30:02] Because behind all this mapping is AI. Show them what, how it can be used to detect anomalies in a, a radiology chart. Show them how it happened. Then they'll get excited.

[30:16] Yeah, because I mean, that's more exciting probably to some. It was more exciting to the gentleman I was talking to the other day than basketball. I mean, I appreciate it.

All right, but they just need to know: what does science mean? What does math mean? That's math. That's a glide, man.

Dr. Lisa Kinnard: And this math is in so many things. One of the other things that I do with our kids is that during Black History Month we talk about Blacks in science.

[30:44] So whenever I'm working with these kids, they get a book. I was working with some nine-year-old girls several years ago, I think it was back in 2017.

[30:53] And so we had them make posters. We gave them pictures of the people and a little summary, and each one of our members would help them make these posters.

[31:04] They decorated the posters and made them look real pretty. We put them on the wall in the school, and they stayed up for the rest of the school year.

[31:11] But what was amazing was that, again, the adults learned so much from those posters. Every time I do this Black history project, I learn so much, like that the reason we have Otis elevators in every elevator we're in is because of a Black man named Otis.

[31:29] And things like the traffic light and all of this. There's a lot of Black science history that a lot of people don't know.

[31:42] Pamela Isom: And that's a way of coaching and teaching us early about what the possibilities are, like you're saying, and also how math and science, and STEM at large, can be applied in areas where we may think they can't.

[32:00] And our young people, they're not only, like, born with the tablets. My granddaughter the other day was showing me how to water plants, and she was just picking up images and dropping them here.

[32:17] And I was like, okay, okay, yeah, I'm getting with the program.

[32:21] Dr. Lisa Kinnard: Yeah, yeah. I was telling my husband the other day, he gets so frustrated because he deals with college kids, and so he's upset with the obsession with the phone.

[32:34] And I said, but what they need to know is not just how to touch things; they need to know what's inside the phone.

[32:42] Yeah, what's inside the phone and why this is a whole computer in your pocket, and what is happening. That's the kind of stuff I learned about in undergrad, very large scale integration (VLSI) circuits.

[32:54] And so they need to understand the stuff inside the phone.

[32:58] The reason phones are taking such good pictures now is that the sensors have gotten so much better. That's what we need to get into: what's inside the device, not just how we use it.

[33:13] Pamela Isom: That's good. So I agree. We've got this gap, but we've got some ideas on how we can help address it and how we can communicate, and so we will get there.

[33:27] I mean, we definitely need the opportunities, and we'll get there with that as well. So as we wrap, first of all, is there anything else you wanted to talk about before you share?

[33:40] I need you to either share words of wisdom or your call to action or both. But before doing so, is there anything else? Did I miss anything that we wanted to talk about?


[33:50] Dr. Lisa Kinnard: You. You did not. I, I will say this is. This has been just delightful. It has just been so much fun. I, I so appreciate you inviting me Here. But as far as words of wisdom, I would say that the people who make it through engineering and science and math, they are not always necessarily the so called smartest in the room, but they're the toughest in the room.

[34:17] They would tell you on your first day of college that only half of you were going to be here at graduation, and that was true. But there are some people who are naturally born with these mathematics skills.

[34:27] I was not one of those people. I was an artist. I was a dancer as a child. I'm still an artist; I'm a photographer. So I stand equally in two worlds, between art and science.

[34:36] So what I had to do was work at it. I liked math, I loved it, but I had to work at it. I've gotten where I am because I worked, not because I just got instant straight A's.

[34:48] Yeah, I was not that person. And so the call to action would be, as we talked about: we have got to make sure that our kids are not just at grade level, but above grade level on these things.

[35:02] We cannot let them fall through the cracks. These parents cannot let their kids fall through the cracks. My mother studied early childhood education, and what they would do is track the Black kids out of the higher classes.

[35:16] And that's still going on today. So we've got to get into these schools and work with these parents and make sure that their kids are at or above grade level, and start, as you said, from pre-K.

[35:28] That's the call to action.

[35:29] Pamela Isom: So we have work to do, and we have to figure out where we're going to apply that work.

[35:38] Yeah. All right, well, I thank you for being here, and it has been a delight talking to you as well.