[00:00:00] Stephanie Wierwille: Welcome to the No Normal Show, brought to you by BPD, a marketing services firm that delivers the future to healthcare's leading brands. This show is where we leave all things status quo, traditional, old school, and boring in the dust. Instead, we celebrate the new, the powerful, the innovative, the bold, while delivering the future to healthcare's leading brands. I'm Stephanie Wierwille, EVP of Engagement here at BPD, and I'm joined by Chris Bevolo, Chief Transformation Officer. Good morning, Chris.
[00:00:30] Chris: Good morning. Looking forward to our, our weighty discussion today.
[00:00:34] Stephanie Wierwille: Weighty. Meaty.
[00:00:36] Chris: Looking forward to it.
[00:00:37] Stephanie Wierwille: Um, and Hi Dez. Uh, Desiree Duncan is here as well, VP of Health Equity and Inclusion. Thanks for joining Dez.
[00:00:45] Desiree Duncan: Hi. Excited to be here and talk about all the things meaty.
[00:00:51] Stephanie Wierwille: Yes. Or, I always say, also vegetable-heavy, if you prefer that, 'cause I'm a broccoli girl, but you know.[00:01:00]
[00:01:00] Desiree Duncan: We’re definitely
[00:01:01] Chris: This is.
[00:01:02] Desiree Duncan: to all of the sweet talk of this topic that’s been
[00:01:06] Chris: Yes, this is the AI broccoli talk a hundred percent. We have to eat our broccoli today.
[00:01:13] Stephanie Wierwille: Yay. Oh my gosh, what a good day. Um, okay, so first up, just a little agenda setting so that everybody knows what we mean when we say AI broccoli. What are y'all talking about? So we have a few, few fun headlines before we get into this meaty broccoli soup. Oh, yuck.
[00:01:33] Chris: What? What?
[00:01:35] Chris: That's more of a stew, really, not a soup.
[00:01:38] Stephanie Wierwille: God. Um, help. Okay, let me bring it back. Taylor Swift is engaged, so there's a, there's a comeback. Um, everybody knows this. The whole world knows this. All 8 billion plus people know this. But we're gonna quickly hit on the fact that brands have been jumping in left and right on this news. We also have, uh, a [00:02:00] really fun headline that we saw around a new reality show coming out that is
All about brand building. I think brands will finally get their time in the sun in the reality world, brought to you by Jimmy Fallon and Bozoma Saint John, who I just adore. And then our real topic, as we're hinting at, um, we're kind of calling it cognitive dissonance, but we've been talking about AI a lot on the show.
If you're a long-time listener, we've been talking about it for probably two-plus years. We're a little obsessed, all three of us, and we've been talking a lot about the really good things that are happening, right? All the opportunities with AI, how you can be thinking about it from a marketing communications standpoint. But we've only sort of scratched the surface on the concerns, the risks, the challenges. And so that's what this episode is all about: um, just digging right into the scary stuff, digging right into what could happen and what we need to be prepared for. So, the topic, um, of the day. First, a few quick plugs. So, um, if you can't get enough of the No Normal Show, [00:03:00] please subscribe to the No Normal Rewind.
It's where you can get all the sources for today's discussion. There's gonna be quite a lot, 'cause we're citing a variety of studies and, um, experts and, um, all the talking heads. And so we wanna make sure that we're linking out to those, so you can find it in the recap. Uh, secondly, we mentioned this last week, but we have two new blog posts out.
One is called The Einstein Divide, and you can find out what that means on our latest show. Um, but you can read the blog, which is all around, uh, how to kind of rethink your marketing function in an AI world. And the second blog post is You're Full, But Are You Growing? Um, how, how are you thinking about growth as a health system?
So you can find those on bpdhealthcare.com. And then lastly, uh, catch us at the Illinois Society for Healthcare Marketing and Public Relations Fall Conference, coming up very soon on September 18th. Our very own Kate Cavanaugh, VP of Communications at BPD, is gonna be hosting a session on our report, which is called Tomorrow Is Too Late:
Time is running out for health [00:04:00] systems to validate their value. So really, really important session here. Um, if you're going, or if you know someone that's going, uh, please come find Kate and, uh, join her for this chat. For more information, you can find that link in the show notes or in our newsletter or on our website, or reach out to us. Okay. I think that's all the items. Um, we have one more, uh, just really important FYI for everybody. Um, and that's related to our, our amazing friend and long-term co-host here, Dez, who we love and adore and who brings all the spice and all the brilliance. She is moving on to a new chapter post-No Normal. So the post-No Normal era for Dez. Um, really, really exciting. Dez, we're so excited for you. We're super sad that we won't have your voice on here every week, of course. But, um, we know we'll be hearing and seeing your voice in healthcare more broadly, and I'm just really excited for all the great change you're gonna be making in the world.[00:05:00]
[00:05:01] Desiree Duncan: Thank you. Bittersweet, definitely, as I have learned a great deal, I mean, especially from the both of you. But just being able to get to sit beside you every single week on this, I will miss. But I'll be listening in to make sure that I'm up to speed. Um, but yes, all of the crying emojis.
[00:05:22] Chris: Yes, many crying emojis, but also happy emojis.
[00:05:25] Desiree Duncan: Yes,
[00:05:26] Chris: Happy emojis for you. Yes. So this is the last show with Dez. Well, not the last show ever. I think it's safe to say that we will have you back in some capacity, and we will channel you, probably in every show. What would, what would Dez do?
What would she say about this?
[00:05:43] Desiree Duncan: Something unhinged, I’m sure.
[00:05:46] Chris: Probably, yes, but we need to make sure we keep that, keep the unhinged-ness, if that's the word. Keep that spice in here. So, yeah.
[00:05:54] Stephanie Wierwille: Yeah. Yeah.
[00:05:56] Chris: well.
[00:05:56] Stephanie Wierwille: Sad face. Um, but yes, also happy face for, for your [00:06:00] next chapter. So, Dez, um, just so grateful for you, grateful for your voice, grateful for your friendship. Um, excited for your next chapter. Um, okay. Uh, we also just wanted to very quickly chuckle at the fact that last week we talked a lot about Cracker Barrel and their new logo.
And since we recorded, I think it was like an hour after we recorded, of course, Cracker Barrel actually took back the old logo. JK, JK, psych. So we just wanted to say that if you listened to our last episode, um, everything changed.
[00:06:35] Chris: Actually, this just in: they're changing it back to the new logo now, and they're calling it HBO Go Cracker Barrel. They're following the same branding path as HBO, which has gone back and forth on its logos. So, um, who knows, by the time this drops, maybe they'll have gone back to the original, before-the-change first logo with, what's his name?
Uncle Harvey.[00:07:00]
[00:07:01] Desiree Duncan: Herschel,
[00:07:02] Chris: Sorry, I’m sorry.
[00:07:04] Desiree Duncan: knows that
[00:07:06] Stephanie Wierwille: Well, if you say it loud enough, Chris, yeah, I think they'll probably do it if you say it loud enough. They're very easily swayed.
[00:07:12] Chris: No, I don’t, I don’t think I can command that sort of power for Cracker Barrel, so
[00:07:18] Stephanie Wierwille: okay.
[00:07:18] Chris: good luck Cracker Barrel.
[00:07:20] Stephanie Wierwille: Yeah. All right. So we'll get into our headlines. We're gonna keep 'em quick because we've got a big topic to discuss, but, um, we just had to touch on the other viral thing. I already mentioned what it was. You know, the whole world knows: Taylor Swift, Travis Kelce, they're engaged. Your English teacher, your gym teacher, it's happening. Um, you couldn't open a social media feed this last week without seeing this and without seeing all the brands that jumped in. So everybody from OPI to Lego to Krispy Kreme to, you know, American Eagle. I don't know what they're saying. Um, but Starbucks and Southwest, everybody had to say something. Um, Dez, Chris, what, what did you see in your [00:08:00] feeds, and what was your take on the gym teacher/English teacher approach?
[00:08:04] Desiree Duncan: I, I don't think I'm on the Swiftie TikTok, so
[00:08:07] Chris: Yeah.
[00:08:08] Desiree Duncan: like, oh, this is happening.
[00:08:11] Chris: I saw it on the news, on my news feeds, not on my socials, really. I've seen it since then, but, um, yeah, it was just news. It was like breaking news. Um, and it was interesting to me because there's been lots of breaking news, said in that voice, over time. Um, and usually it's just like a grenade going off, for all kinds of reasons, in the world, in the country, whatever.
Um, and so this was like a, a grenade of flowers. Is that a thing? That doesn't seem to make sense. It was like a positive grenade. I, I need to get away from grenade; that's probably a terrible metaphor in all the ways. Um, when it hit, it just felt like, oh, something happy. We need more of that. I'm not sure everybody sees it that way, but I think actually most people seem to be [00:09:00] trying to embrace it in a happy way.
Um, so yeah, kudos. Kudos, and I think it’s great. Good for them.
[00:09:07] Stephanie Wierwille: Yeah, yeah. It was a nice, happy moment. I think that, um, I have two thoughts on it. Number one, the reach was just... uh, to your point, like, it was one of the only things I've seen, I can't remember since when, where everybody was excited. Maybe Dolly Parton's the only other one that commands that kind of energy. But I saw an estimate that it may have reached over a billion people.
It got 35-plus million likes already, and, um, that's an eighth of the world that's somehow leaned into this conversation. That's just
[00:09:36] Chris: Wait, we have 8 billion people.
[00:09:38] Stephanie Wierwille: Yes,
[00:09:39] Chris: 8 billion now.
[00:09:40] Stephanie Wierwille: it,
[00:09:40] Chris: Oh my gosh.
[00:09:42] Stephanie Wierwille: this number. Yes. Um,
[00:09:45] Chris: like by a billion every few years. It’s no good.
[00:09:48] Stephanie Wierwille: it’s
[00:09:48] Chris: Anyway, we’ll get to that later.
[00:09:50] Stephanie Wierwille: growth is now negative, but that's another topic. Um, and then from a brand standpoint, I liked what our friend Dr.
Marcus Collins had to say.
[00:09:57] Chris: I.
[00:09:57] Stephanie Wierwille: He was on some news channel, CNN, I don't wanna quote [00:10:00] that exactly, but one of the news channels, commenting on why brands feel the need to jump in on this. And I just thought that was a good take.
[00:10:08] Chris: Wait, was he saying they should, or was he questioning.
[00:10:10] Stephanie Wierwille: He was like, come on. Like, do you really have to have something to say?
[00:10:15] Desiree Duncan: It's a viral moment, an opportunity, you know, to jump on the bandwagon again. It's almost like using the, uh, the sound, the trending sound. It's like the same thing. You, you jump on it, but, oh well,
[00:10:28] Chris: I don't know. I kind of agree. Like, if it makes sense and you can do it in a way that feels natural to your brand, I don't see the harm, but I think in most cases it's gonna come off ham-handed. Is that the right term? I wonder what the derivation of that term is. No, ham-handed.
That's a real thing.
[00:10:49] Stephanie Wierwille: Well, that
[00:10:50] Chris: Imagine walking around with hams on your, like, hands, and you're, you can't do anything well.
[00:10:56] Desiree Duncan: Next
[00:10:56] Stephanie Wierwille: Okay. All right. Um, [00:11:00] okay. We'll keep this one pretty quick, but it's just another exciting and happy moment for the marketing world, 'cause we need those. Um, On Brand with Jimmy Fallon is coming out soon, September 30th, with marketing leader Bozoma Saint John, who is also a Real Housewife now. But anyway, Boz is so much more than that. She is just absolute awesomeness. Anyway, um, so sit back, be your armchair marketer, and watch everyday people come up with brand ideas. It's gonna be fun.
[00:11:31] Desiree Duncan: I am so pumped for this. This is our, like, Top Chef. Uh, I can't... I mean, hopefully it's good. Hopefully it's not, like, watered down by Jimmy Fallon, like, giggling all the time.
[00:11:42] Chris: Yes.
[00:11:43] Desiree Duncan: Um, but the, the potential is actually kind of exciting. Um, 'cause I, I spend most of the time, like, for example, when we're at, like, games or something, like when we're at WNBA games, uh, with my partner, I'm usually like, what's the brand strategy for the WNBA?
Like, totally geeking out, not paying attention to the [00:12:00] game. So very excited to force her to watch this.
[00:12:04] Chris: I like the concept of it, but are you guys familiar with, um, Carbliss? Carbliss, um, seltzers or vodka drinks?
[00:12:14] Desiree Duncan: Oh
[00:12:15] Chris: It's, it's spelled like one word, Carbliss, like carb bliss. So it goes back to, like, zero carbs in this alcoholic drink. Um, Jimmy Fallon is the Carbliss of comedians, um, because basically, like, you drink it and you're like, ugh, that is too sweet. Like, uh, like artificial sweetener. So he really grates on me. Um, so I don't know whether I'm gonna be able to deal with his giggling, as Dez put it. I'm sure that's part of the attraction for many people, and it doesn't need to be a serious show. I think it makes sense to have some fun with it, but I don't know that I can stand his artificial-sweetener brand of comedy.
[00:12:57] Desiree Duncan: Wow.
[00:12:57] Chris: So I'll leave it there.
[00:12:59] Stephanie Wierwille: Hot take. Yeah,
[00:12:59] Chris: [00:13:00] My 2 cents, nation.
[00:13:01] Stephanie Wierwille: Yeah. Well, I will say Boz, Boz will, uh, play a nice, um, balance to that, because she's got a lot of depth and knowledge, and
[00:13:10] Chris: I don’t even know who that is. You’re gonna have to send me a picture or something.
[00:13:14] Stephanie Wierwille: Oh.
[00:13:15] Chris: Maybe I do, and I just don't know that I know her,
[00:13:18] Stephanie Wierwille: Big fangirl here.
[00:13:19] Chris: so. Okay.
[00:13:21] Stephanie Wierwille: Um, that's your homework, Chris. Uh, I'll send you her book. Alright, so we're moving on to our main topic. Um, like I said, AI cognitive dissonance. So yes, we're all three very excited. Yes, we believe that AI is the future for marketing communications. Yes, we believe that all chief marketing officers should be leaning in and prioritizing AI. At the same time, um, over the course of us discussing this topic, there have been so many things that have popped up around, you know, the environmental impact, the potential job risk, which I think we did a whole episode on, corporate risk, legal risk, data privacy [00:14:00] risk, societal challenges, misinformation. I mean, the list goes on.
We actually put out a list. We, we plotted out a list, and it was really long, prepping for this episode. So, um, in the spirit of trying to organize it, we're gonna go through this topic in four broad categories. So I'll just set those up, and then we can dig in. The first category is the individual impact, um, to you as an individual human, around AI, short term and long term.
Second is corporate and organizational risk, especially important for our listeners to be considering. Third is environmental impact. We had to break that out as its own category 'cause it's just so huge. And then fourth is the broad societal challenges. So some of this stuff will be data, some of it will be scary, some of it will be a little bit of our opinions too. Um, my, my little joke here is that it's pumpkin spice season and Halloween is already in all the retailers, so we're just leaning into the spookiness. There's gonna be a little spook, um, in some of this discussion, but we really wanna dig into it, uh, [00:15:00] because we can't ignore it as we think about the future. Anything you all would add before we get into our categories?
[00:15:07] Chris: I am ready to eat some broccoli.
[00:15:10] Desiree Duncan: Serving up all the vegetables today.
[00:15:12] Stephanie Wierwille: Yeah. All right. Okay, so I'm just gonna set up this first category, and then we'll get into it. So the first bucket here is the individual impact. What's the impact to individuals? So this is things like, how do individuals start thinking about the future of jobs and employment? Um, we've got some stats we'll throw out, uh, and I'll come back and go through those. Um, the second kind of subcategory here is isolation. We've seen things start to pop up around how people are leaning into generative AI and maybe even having an unnatural connection to their AI bots in various ways. There's also data showing that it can lead to cognitive decline if you start outsourcing all of your cognitive tasks to AI. Um, and of course, personal data privacy. So those are just some of the [00:16:00] types of, um, implications, but I'm sure there are many, many more. Um, so let me just throw out a few quick stats, and then I wanna hear you all's take. Um, so on the jobs and employment side, I think the jury's still a little bit out on what's gonna really happen with employment, but some early data has shown that in AI-affected roles, like software development and customer service, there have been declines somewhere between 13 and 20%. That's data from Stanford. Um, and recent economic modeling shows that we could see unemployment actually double in the coming years. Um, so those are just some examples, but I'll pause here on the jobs and employment front, knowing that we've, we've talked about this a little bit. Um, but Dez and Chris, what would you all add in terms of individual impact?
[00:16:49] Desiree Duncan: Yeah, for me, I’m thinking about like, my little cousin just graduated from college, you know, she’s trying to get into law school. But then it’s looking at like what is actually the future of law school and you know, what is the [00:17:00] future of that entry level job? ’cause we’re hearing notions around AI being able to do some of that entry level work.
So then, you know, how are young folks getting into the workforce and getting that experience, right? Or how do you go about getting hired if, you know, entry level isn't the thing? Or, excuse me, it's more about how do you get experience. Um, that's kind of where my mind goes. And yeah, it's wild to see some of the, uh, storylines from movies and shows actually come to life.
But Chris, your thoughts?
[00:17:31] Chris: Yeah, I mean, that's the filter I use too, Dez. I mean, when I think about all the things we're gonna talk about, I think about my kids. My middle daughter, uh, is graduating in December with a degree roughly in digital marketing, and I had a conversation with her, like, how much are you learning about AI?
She's like, we haven't learned anything about AI. I was like, oh, oh boy. Um, and I started to kind of share with her, like, you need to do this. And she was like, no, AI is horrible. So we're gonna get to the [00:18:00] other side of it and some of the other broccoli eating we're gonna do here. And all of her points were completely valid, as we'll talk about.
That's why there's cognitive dissonance. Um, but we did have lunch around it, and I said, look, I don't disagree with anything you're saying, and I actually support your point of view on all this. But if you want a position in this space, I don't know how you, I don't know how you don't somehow learn to leverage AI.
So, um, imagine... or your, your niece. Is that who it was, Dez? Um.
[00:18:36] Desiree Duncan: little cousin.
[00:18:37] Chris: Your cousin, yeah. So law school's expensive. You're committing a lot of money to go to law school. And imagine getting out of that and being told, yeah, no, we don't need entry-level lawyers. Or, we used to need 50 lawyers to run this business, and now we need 10.
Um, so yes, that generation, that's what I think about, 'cause that's where my [00:19:00] kids are. So, um, I know we've talked a lot about jobs, so it's probably one we shouldn't spend as much time on, but it does set up all the other things that we're gonna talk about too.
[00:19:09] Desiree Duncan: I mean, the cognitive decline, like, the critical thinking skills that we have now... I'd be curious to see what that actually looks like in a few years. Am I still gonna be able to connect the dots, uh, on things? Or am I gonna have to use AI for everything? So it's like, how do I stay sharp with that tool? Um, yeah, that's, that's probably my big concern as well.
[00:19:31] Stephanie Wierwille: Yeah, and I'll just add one more thing, um, on the jobs front, and then we'll move to kind of the mental health aspect: cognitive decline, isolation, all that good stuff, emotional reactions. Um, I think, Chris, to your point around, you know, entry-level folks, I saw this term, digital Darwinism, which made a lot of sense to me. Yeah. Yikes. Um, it's referring to, you know, those who grow [00:20:00] AI-savvy versus those who don't. It's kind of this decline, um, or divide, I guess I should say, um, between the two. Maybe not the best term for it. Um, okay. So yeah, you brought up, Dez, the cognitive decline. I think there's so much around not just this, but just the emotional reaction.
We talked, you know, the other day about how folks are developing an unnatural connection to AI, relying on it, not just cognitively but also emotionally. Um, so let me just toss that out to you all and see what you're thinking about there.
[00:20:30] Desiree Duncan: Yeah, I mean... Oh, go ahead, Chris.
[00:20:33] Chris: You first, please. After me.
[00:20:37] Desiree Duncan: I mean, having a therapist in the house, you know, the conversations around people using their AI as their, like, personal therapist, like, you know, good or bad. But it's also that the original ChatGPT-4o was very sycophantic. It was just telling you all the things that you needed to hear. Well, not needed, wanted to hear. Uh, versus that kind of going away, which happened with 5, [00:21:00] um.
But yeah, very, very concerning, uh, what that's gonna do for people's mental and emotional states.
[00:21:08] Chris: And we just saw a lawsuit, uh, last week, I think, from parents, um, against, uh, OpenAI, because their son died by suicide in a way they believe was induced by AI. Um, and my wife, who's a, a therapist, a clinical therapist, um, shared with me a few weeks ago that one of the hottest topics in therapy circles is AI-induced psychosis.
Um, and how they're starting to see that come through. Um, and we've seen public examples of that. Um, and I can't remember the name, and I don't wanna get it wrong, but including a very high-level, respected venture capitalist in Silicon Valley who, like, publicly went a little sideways and had friends and coworkers really worried about his state.
Uh, from AI-induced psychosis. So all that stuff is real, probably growing [00:22:00] the, we
[00:22:04] Desiree Duncan: Yeah, watching the,
[00:22:06] Chris: I’m breaking up.
[00:22:07] Desiree Duncan: LinkedIn. You're breaking up.
[00:22:10] Stephanie Wierwille: Yeah. Yeah. We, we caught the gist, but your last sentence broke up a little bit. Yeah. Maybe repeat that.
[00:22:16] Chris: Sorry, repeat the whole thought? Which thought? The therapy thought? Oh.
[00:22:20] Desiree Duncan: Just the LinkedIn part.
[00:22:25] Chris: And then we've, we've even seen public examples of this. For example, a very high-level, well-known venture capitalist in Silicon Valley who went all sideways publicly and had friends and coworkers, um, expressing concern about AI-induced psychosis.
So this isn't just, you know, people that are out in their garage and unemployed and blah, blah, blah. Some of the smartest people out there could fall prey to this. So, real thing.
[00:22:58] Stephanie Wierwille: Yeah, I saw, I think it was just [00:23:00] this past week, I believe, that Sam Altman, um, kind of acknowledged that OpenAI recognizes those challenges. And I think that maybe came out a little bit around the, the upgrade from GPT-4o to GPT-5, where they did, you know, admit that there were sycophantic, um, types of responses, where it's just really reinforcing whatever you say. It's like, yeah, that's a great idea, no matter how bad your idea is. Um, and so what he said recently was, yes, we know that it's a problem. But then he sort of said, but less than 1%, way less than 1%, of ChatGPT users are using it in a mentally unsafe way. But then, in listening to our friend Paul Roetzer's episode this past week, um, on Marketing AI Institute's AI show, he was doing the math and saying, okay, but with 700 million monthly users of ChatGPT, that's 7 million people that are using ChatGPT in mentally unstable ways. Like, should we not really quantify this and say, there is a real problem here?
And that just blew my mind a little bit. [00:24:00] I mean, I think I've seen it in my own use, you know, very much recognizing sometimes when I go down a rabbit hole and it just reinforces it. And you can think, depending on where you are in your life, that could be very challenging.
[00:24:15] Chris: I know we've got a bunch of other stuff to talk about, but the fear to me is that, um, it gets dismissed in the way you just talked about. Like, oh, you know, ChatGPT doesn't drive people crazy, people drive people crazy, kind of thinking. Like, oh, that's just the person's problem, not the tool. They're just using the tool wrong.
Um, and it'll just be brushed aside, and we'll just keep moving forward in the pursuit of, you know, AGI and profits and all the other things. So that would be the fear in this area.
[00:24:46] Stephanie Wierwille: It feels like, and we'll tie in education later on, but it feels like this is where there's some real education needed, even in K through 12, around this. Okay. Let's move to our next broad bucket here, which is corporate and organizational risk. We could go on forever about this one [00:25:00] too, but I'll just kind of give some sense of what we mean by this.
So we're talking about, you know, what's the risk or the threat to organizations, short term and long term. It can be everything from legal and regulatory exposure, which, of course, in the healthcare space is really important. And, you know, for many health systems, that's why they're kind of leaning into Copilot versus OpenAI.
But does Copilot even... you know, what are the risks there, right? So it can be that tactical. Um, it can be, uh, you know, IP and copyright chaos; lines are still really blurry there. Thinking about the reputation and trust of an organization. Last week we covered, you know, union rallies happening around AI.
That's one example. Or it could be the long-term organizational viability, which, okay, that's a big deal. Um, but, you know, who's gonna be the winners and the losers? So I'll just pause here. Um, I'll throw it to you first, Chris. Um, what are you thinking about in, in this area?
[00:25:54] Chris: Um, I mean, honestly, this maybe sounds terrible, but of the areas we're talking about, it's the one I spend the least amount of time [00:26:00] thinking about. Um, I'm not worried about corporations. I mean, I just feel like they're going to figure it out. Of course there's risks, and I think if we talk about the risks to our, our audience, that's fair game. I think we do a good job of covering that when we talk about AI: uh, the privacy issues, the, you know, hey, plug something in and you come up with the best idea in the world, but you can't trademark that. All those kinds of things. Um, I think when we talk about how to use AI and the promise of AI, we cover those.
Uh, so I don't have too much to add, uh, because that's our world. So, you know, I think it's real. But honestly, of the four, it is the, uh, the one that gives me the least pause, because corporations will be okay. I don't think I need to worry about them. That's my take.
[00:26:53] Stephanie Wierwille: Yeah. Well, okay. I like, I like that. You know, maybe we can prioritize here. I think the one last thing I'll say [00:27:00] is that the importance of data privacy, security, and legal in this area is super critical. So don't go it alone in your organization. And, you know, we have entire conversations with organizations, especially CMOs, around how to lead out on an AI initiative.
But I think it's just becoming more and more clear: do not go it alone. Don't assume that you know. That's really important. Um, so why don't we move into maybe the ones that you all are most, um, passionate about here. So, Dez, I know you've been doing a lot of homework around the environmental impact. Um, so I'm not even gonna set this up or give any stats. Dez, if you wanna take this one and kind of share what you're seeing, um, 'cause you've been kind of bringing Chris and me up to speed in this area in the last months.
[00:27:46] Desiree Duncan: Yeah, I mean the conversation is definitely around the data centers, right? And the effect, the environmental effects there. You know, water being needed for cooling, uh, as well as the carbon emissions. Um, but what brought this mostly to my attention is some of the articles [00:28:00] that are around, you know, what communities are most.
Affected by these, and it got me thinking about, and so we’re seeing, uh, you know, a lot of headlines and, you know, um, stories around the effects of, especially from the carbon emissions in Memphis, uh, as well as other states that are dealing with the, the, the water supply. Um, but you know, just something to be mindful of, you know, people often think like, okay, you know, the, the plight of folks like who’s actually being slighted here and like, are they, are these. being, um, specifically targeted, uh, for this, and I would say follow the money. The most important color to these corporations is that color green and that decisions are often, you know, made based off of the economics. So the intent is to have a net positive economic outcome, which is money saved or money earned, right? But then that also creates social. Impacts and it affects groups more than others, which is incredibly unfortunate. Um, so when you think about these data centers, [00:29:00] centers, most of them are surfacing on the southern half of the states, not just the south, but the southern half. So whether that’s, um, you know, from Georgia to California, um, that’s because the land is cheaper.
There are high. Tax incentives for these businesses to show up there. Uh, but then when you really look at what’s the difference in some of the harmful effects of these areas, um, you really look at like, what are some of the decisions that these states have made, whether it’s bringing in new industry. Or even just having lax laws.
So for example, you know, data centers that are in, you know, south southeastern states, you know, they’re typically more polluting due to the reliance on local fossil fuels, plants, natural gases are being used, which is causing that emission, uh, carbons, um, insufficient investment in renewable. Energy and just weaker regulations around, around air and water pollution, which is deeply concerning.
So if a company doesn't have to do all of these regulatory things, then they're not going to, right? [00:30:00] Versus when you're seeing some of the data centers that are more in the Southwest, but then also the largest portion of these being in Northern Virginia, as well as Illinois in the Chicago area. Those have the largest amount of infrastructure, but they have stricter permitting. You know, there are grid-scale renewables that they're able to plug into, they're creating more innovative cooling systems. Again, this is all based off of the laws and the regulations of that state. If a state, like a California, is putting more stipulations on making sure that, hey, this isn't as environmentally toxic, obviously it still is, but if they're putting rules and regulations around that, that keeps it down, and they're actually gonna take those steps. Versus if you're in a state that doesn't have that, they're like, why invest in something that's gonna cost you more money in the long term when you've already signed on for this lower cost?
So just when you're [00:31:00] thinking about that, kind of keep that in mind. But I know I just rambled on. What are some of your thoughts around this?
[00:31:06] Chris: I mean, that's all amazing, Dez. I have four specific thoughts. First, there was somebody, Stephanie, hopefully you can help me out, maybe Dez you can, a very high-level AI person, somebody who's been around, I don't know if it's a name we know, who in some interview within the last couple months said, I could see the earth blanketed with data centers. It's very dystopian, almost Matrix-looking, when you think about that. So, you know, hopefully that's not where we're going, but I mean, I think there are people that would be like, sure, why wouldn't we do that? Um, I'm gonna drop two book recommendations.
The first is Empire of AI, which does a phenomenal job of looking at some of what we're talking about, not just the environmental, but it does a lot around the environmental, and it makes a comparison to empires of old and how empires are built and grow. And one [00:32:00] of the tools or strategies of empires is exploitation.
So the exploitation of natural resources, a thousand percent, is exactly what Dez is talking about. So definitely read that. The other one, which sounds like it's not related, is a great book called Murderland. It's by an author who found a correlation between all the serial killers that emerged in the seventies and eighties and their proximity to industrial pollution.
Specifically in Tacoma, but also the Pacific Northwest. It's a phenomenal book, because if you like true crime, it's about that, but it's also about the society. And one of the threads in there is how these companies that were basically pumping arsenic and lead into the environment, remember, we had leaded gas until the late seventies, would literally just lie. Very much like the cigarette companies lied about the impact of [00:33:00] nicotine and cigarettes, they would just lie about the impact of arsenic and lead. There's even one part of the story where she points out that at one of the huge lead smelters, or whatever it is, in the Pacific Northwest, a part of it shut down, a part that actually helped mitigate the pollution, and they had to do a cost-benefit analysis of keeping it running.
What was the cost-benefit of keeping it going versus how much they'd have to pay for every kid that would die from lead poisoning? And their cost-benefit analysis, this is all public record now, showed that the company was actually better off moving forward and just paying off the kids' families.
So my point in that is, again, last time I said I think the corporations will be fined. We're going to be hearing so much about, oh, this is just no big deal, no problem, it's not gonna be a big issue, you guys are thinking about it all wrong. Do not trust that. I'm not saying they're gonna be lying again, but...
Dez said it: the color that is [00:34:00] most important is green. Not that green, we're talking about money green. And so I just have that skeptical eye. And then the last thing I'll say, I said four points, the last point is, we have roots in northern Wisconsin. It is so pristine. I feel like it's the last bit of the upper Midwest that has not been discovered by people putting up giant mansions and all of it.
Um, and we actually have roots right on Lake Superior. And so I am just, like, radar up, because you hear about the Great Lakes being this amazing source of fresh water. I do not want these data centers up in that area, destroying the environment and destroying the amazingness. And I hope, Dez, that because of the folks up here who understand the importance of the environment, that won't happen.
Uh, but money is money. So there you go. That was long too, but this is a big one.
[00:34:56] Stephanie Wierwille: Yeah. And I'm gonna dig into your point about [00:35:00] how there are all these different points being thrown out of, oh, it's not that much. So, just to give some examples, 'cause I try to think about what can I do as an individual, right? Who's in this every day, who's out here prompting like crazy. And there's all this conflicting information about how much energy it really takes. So for example, I think one recent study has shown that a typical generative AI interaction uses 10 times more energy than a Google search. So that tells you it's already a lot.
And we also know the way that you work with a generative AI tool is far more back and forth than you would with Google. So to put it in context, that's a lot. On the other side, I've seen where it says, oh, it's actually more equivalent to running a microwave for one second. So that's like, oh, it's not that much.
But what I think really got my wheels turning was some initial data that's showing that what we
[00:35:51] Chris: So.
[00:35:51] Stephanie Wierwille: use to generate energy through these data centers, and what we're already at, is somewhere around the energy intake of a country, between the size of the energy [00:36:00] intake of Saudi Arabia and France in terms of the global energy intake that's required currently. So that blew my mind, to be like, oh, we're adding another country of that size to our overall total, and that's just going to exponentially increase and increase over time. So I think that's really hard to deal with, because then you get to yourself and you say, what do I do about this? Um, and how do I handle this?
Do I prompt less? Do I go back and forth less? Do I not use these tools? Do I buy an electric car? Like, that's where I'm at, I think.
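To make sense of the conflicting comparisons in this segment, here is a minimal back-of-envelope sketch. Every number in it is an assumption, not a figure from the show: a commonly cited ~0.3 Wh per Google search, the "10x" multiplier Stephanie mentions, a hypothetical ~1100 W microwave, and rough IEA-scale annual totals for data center and national electricity use.

```python
# Back-of-envelope check of the energy comparisons discussed above.
# All figures are rough public estimates, not measurements from the show.

google_search_wh = 0.3                    # ~0.3 Wh per Google search (commonly cited)
ai_query_wh = google_search_wh * 10       # the "10x a Google search" claim -> ~3 Wh

microwave_watts = 1100                    # a typical household microwave (assumption)
microwave_1s_wh = microwave_watts / 3600  # running it for one second, in Wh

print(f"Estimated AI query:  {ai_query_wh:.1f} Wh")
print(f"Microwave for 1 sec: {microwave_1s_wh:.2f} Wh")
# The two framings differ by roughly 10x, which is why they sound contradictory.
print(f"Ratio: {ai_query_wh / microwave_1s_wh:.0f}x")

# Country-scale framing: global data center electricity use vs. national totals
# (TWh per year, rough magnitudes only).
data_centers_twh = 415   # IEA-scale estimate for all data centers, 2024
saudi_arabia_twh = 400   # approximate annual national electricity consumption
france_twh = 450         # approximate annual national electricity consumption
print(saudi_arabia_twh <= data_centers_twh <= france_twh)
```

The point of the sketch is only that both the "10x a Google search" and the "microwave for a second" framings can be true at the same time; they describe quantities about an order of magnitude apart.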
[00:36:34] Desiree Duncan: Yeah, that’s where,
[00:36:34] Chris: We'll get to what you can do about it, but sorry, Dez, I was gonna say, at the end of the show we'll have some suggestions. But, you know, none of these are easy to deal with. Sorry, Dez.
[00:36:45] Desiree Duncan: No, I was just gonna say, like, yeah, just thinking about the internal conflict, but we'll get to that.
[00:36:52] Stephanie Wierwille: Sentence. Okay. Let’s hit our last
[00:36:53] Chris: de.
[00:36:54] Stephanie Wierwille: last bucket here so we can get to that part. Um, gosh, this one is really overwhelming, [00:37:00] which is the broader societal risk. So as you zoom out to the collective future of humanity, what is that broader societal risk? We've talked in the past about things like misinformation and deepfakes and what that leads to from a political standpoint.
I just saw this term, probably new to me only, but truth decay, this broader sense of truth decay, which of course we covered in Joe Public. Cultural division and social unrest. Even things like AI leading to additional kinds of weapons, weaponizing biology, and being used in defense. The future of education, how do we think about educating children moving forward, and where is AI helpful and harmful? And then lastly, this broader socioeconomic exploitation. Dez already mentioned things like the Global South being what's usually leaned on in terms of data centers, but then there's also, you know, we talked about jobs and the divide that's coming [00:38:00] there.
So I'm just gonna stop there, throw all that out at y'all, let you react, and then we can come back to, like, how do we deal.
[00:38:08] Desiree Duncan: Uh, again, help. Um, I'll jump in with education. I mean, there is more of a focus, like we're hearing about the graduates who have just gotten outta school, they have no AI training in that regard, but it's starting to become more of a mandate, both in K through 12 as well as in higher ed. And so it's like, how are we properly educating our students around this? Right. But then there's also, in that social unrest you mentioned, the potential of what can be. Even now, when I'm on my social feeds and I see a video that looks like, oh my God, that is so disastrous, what is it?
I'm like, I'm looking for signs of, is this actually real? Did that actually happen? Like, does it have a little bit of that AI video bleed to let me know that this is actually not real? But as that improves, that's [00:39:00] gonna get harder and harder to suss out, because everything is gonna look hyperreal. Um, but again, we'll talk a little more about this in a bit, there are hints of this that we're already kind of seeing in movies and television shows that are talking about this, but
[00:39:15] Chris: Mountainhead. We already talked about Mountainhead, that's exactly what you're talking about. And I also think you can look to countries that have significant unemployment issues, particularly with young people, and you can see the social unrest that comes from that. And we've seen that over the last decade or two.
I mean, I think it's probably throughout human history, but, I mean, we talked about the individual impact on your job and your career, but if we do really start to see wide swaths of people just not being able to get real, decent jobs to actually support themselves, and we see unemployment at, imagine it at, like, [00:40:00] 15, 20 percent, which is Great Depression level, there will be civil unrest. History will tell us there will be, because people will be so desperate that they won't have any other means to basically fight for survival. So again, this is all dystopian and terrible, but it's also,
like, if on one hand we're imagining what we can do with AI in three years in marketing, and at the Joe Public Retreat we're gonna explore this amazing vision, we have to be realistic about the potential flip side of that. And so I am not encouraged that anybody with any kind of real authority is either planning for this stuff or maybe even cares about it. I mean, I think the people with real power in this country are just like, hey, progress, you just gotta go with progress, you gotta keep moving forward, doesn't matter what happens. That's the world. Um, so [00:41:00] I guess we'll get to, like, what can you do about it?
But, um, you know, everybody’s got a part to play if we want the right future.
[00:41:07] Stephanie Wierwille: So let's get into that. What do we do? I just wanna very quickly note, from a healthcare standpoint, which is the industry we live and breathe every single day, there are so many healthcare implications, and health implications, of what we just talked about. I mean, from the environmental impact standpoint to what you were just talking about, Chris, the civil unrest and unemployment that may unfortunately come out of this, all of that has extreme healthcare implications. So I think I'll start this "what do we do" conversation by saying, one thing to do, at the organizational but also the public health level, is sit down and scenario plan.
You know, what does this look like from a health standpoint? What does this look like from a healthcare standpoint? What can you, especially as a large healthcare organization, be doing to help those in [00:42:00] charge, in various ways, think through some of this and plan for some of this? So I'll start the conversation with that, but I'm gonna toss it over.
Dez, I know you have a lot of thoughts on, you know, how do you think about sociology, and considering history, in terms of what do we do now?
[00:42:18] Desiree Duncan: I mean, such a history nerd. You actually, Stephanie, mentioned this earlier this week, around just, anytime there's societal progress, there are unintended consequences. I think that's by Yuval Noah, I, I'm not
[00:42:31] Stephanie Wierwille: Yuval Noah Harari. He is my, he is my fave.
[00:42:35] Desiree Duncan: Yeah. So then of course, like, I dove into it, like, okay, right.
'Cause there have been all of these, you know, the Renaissance, industrial revolutions, the information age, all of that good stuff, where we mostly look at the great expansiveness that happened during that time. But there was also all of the stuff that created great disparity, that we don't think about.
So with this AI, we [00:43:00] also wanna be cognizant that great advancement also comes with some great risks. So I'll give a couple of examples. With the Renaissance, that created, you know, mass literacy. The printing press came during that time, which democratized knowledge, but it also created social upheaval, censorship, pamphlet wars, religious conflicts, and, you know, the colonial exploitation that a lot of that begat.
Then the Industrial Revolution. I'm actually watching The Gilded Age right now. I just watched the Edison episode, where they got electricity. So you got electricity en masse, new employment opportunities, transportation revolutions, the railroad. But you also had major railroad wrecks, people died, dangerous working conditions, child labor was a thing, the environmental pollution from that.
The environmental pollution from that. And then as we’re getting into our a age of information, you know, that gave us such global connectivity, we’re able to communicate with folks across, um, you know, [00:44:00] various continents to tech boom that also created privacy erosion misinformation that we just mentioned before. Um, and then. Biotech medical breakthroughs, personalized medicine, disease eradication. But then now we have over genetic modification, like some of our food isn’t even real anymore. There was this cross pollinated, which then to your point, Stephanie, uh, has great effects on our health. You know, what we’re putting into our bodies isn’t always, you know, natural. Foods anymore, at least in this country. Um, so again, like the, the history of this and that, looking at, you know, ai like we’ve been talking and being so, uh, bullish about the positivity of it, we, it would be remiss for us to not mention like, here are the other sides that we need to make sure that we’re seeing, uh, abreast of any other, takes any ads from folks.
[00:44:51] Chris: I'm trying to find the quote, but when you're talking, it reminds me, I think it's Jurassic Park, Dr. Ian Malcolm, [00:45:00] the chaos theorist guy. I think he said something like, rebirth is brutally violent. So he's kind of talking about, like, oh, you're bringing back dinosaurs.
You're not realizing that's gonna be a violent event. Um, and I think a lot of what you're talking about is basically some form of violence, right? It may not be physical violence, but it could be societal violence or whatever. The other thing I'll say about this is, you know, we certainly can use these other situations to learn from and plan for, but I also think about the Black Swan theory, which is, you know, sometimes something comes up that you're not expecting. And you do hear, like, AI's going to be different, right? Because you hear a lot, whenever we have these new technology breakthroughs, whether it's the wheel or the printing press or whatever, there's a huge disruption, but it actually creates more jobs, just different jobs, as one example. And you do have to be thoughtful about, is that really gonna happen again this time? Maybe it will. [00:46:00] Um, I don't know. But it's just a lot to think about.
[00:46:05] Stephanie Wierwille: Yeah. And I think the honest reality is we don't know. You're right, like you just said, Chris, so many people keep asking, is it gonna create jobs or remove them? It's like, I don't know. We don't know. And we didn't know social media's consequences until we were 10 years past social media evolving. And to your point about all these various revolutions in human history, Dez, like, the reason I like Yuval Noah Harari so much, well, there's a hundred reasons, but he's written five books, at least that I've read, and the first one, Sapiens, makes the big point that it's the agricultural revolution that put us on this path, way back in human history.
I never have ever sat down and thought, like, oh, a lot of what we do today that we don't like comes from that decision. But would any of us go back in time to be nomads, like hunters and gatherers? No, we wouldn't. Would any of us go back in time to be on the horse-and-buggy path, or not have railroads, or not have the internet?
I don't think that we would. So that's the [00:47:00] point. The point is not to say, let's not move forward with the progress. The point is to say, let's be smarter this time. Let's collaborate better this time, let's scenario plan better this time, let's look ahead a little bit more instead of waiting until we have a mental health crisis like the one from social media.
So I think that's at least what I feel like we should be doing.
[00:47:24] Desiree Duncan: And, and that brings us to
[00:47:25] Chris: We should be.
[00:47:26] Desiree Duncan: Yeah. It brings us to our next point around, you know, staying educated, staying up to speed, paying attention, right? And if you're not a big news fan, like I, I'm not, but I always kind of pay attention to the signs that are showing up in the world, right? So, like I mentioned The Gilded Age, it's always interesting, I'm watching the first season, which was several years ago, but there are nods to kind of what we're experiencing today. You know, when they're like, oh my God, the electricity, this is great, but should we? And I just was like, [00:48:00] oh, this is AI. Um, but thinking about other films that have touched on this, like Her, that keeps coming up, where this guy develops this relationship with his software, and like, oh, that could never possibly happen, but that's happening. Eddington, I kept trying to get y'all to watch that, because so much of the conversation about that movie was focused around, like, the divisiveness and this and the third, of what happened during COVID times. But in reality, the whole movie was about a data center being built in
[00:48:21] Stephanie Wierwille: Hmm.
[00:48:21] Desiree Duncan: a New Mexico city, but no one was paying attention to that.
So again, there are always signs showing up within pop culture, whether it's music, movies, or television. Just pay attention and connect the dots, use your own internal AI to connect the dots with that.
[00:48:41] Chris: Here's where we say, do your own research.
[00:48:43] Desiree Duncan: Oh. That
[00:48:47] Chris: It is okay. It’s okay. I know. I know.
[00:48:50] Stephanie Wierwille: Yeah. Um,
[00:48:51] Chris: yeah, you gotta stay smart, stay educated.
[00:48:54] Stephanie Wierwille: A lot to chew on here. Um, I think we had so many food metaphors throughout this, whether it’s broccoli or meat or [00:49:00] pumpkins. Um, lots to chew on. So, yeah.
[00:49:03] Chris: A broccoli meat stew. Yeah. Okay.
[00:49:05] Stephanie Wierwille: That's what it feels like a little bit. Ugh. A little yucky. So yes, with that we'll close here, we'll wrap here. It's getting harder and harder to close these shows, 'cause we're talking about some really complex topics. But for anyone listening, let us know if you have an add. If you have a giant solution to this problem, we'd love to hear it. Shoot us a note at No Normal at BPD Healthcare, and share the show with friends and colleagues. We always love any reviews and ratings you can leave. And until next time, don't be satisfied with normal. I think that would be the way to walk into a bad future, if we were all just sitting here satisfied with whatever happens, happens. So the whole point of the show is to push the No Normal, to think about those things, to connect the dots in our own organizations and world. So with that, talk to you next week.
[00:49:56] Desiree Duncan: [00:50:00] It's been a pleasure, and can't wait to see what comes for the new normal. Bye.