Creative & Ethical Activism Featuring Denise Love Hewett and Wendell Wallach

In this episode, the multi-talented Denise Love Hewett talks about the intersection between creativity, leadership and activism and Wendell Wallach, “the Godfather of AI Ethics”, discusses the many challenges in the expanding world of artificial intelligence.

 


00:00:00 --> 00:00:06 Welcome. I'm Erik Fleming, host of A Moment with Erik Fleming, the podcast of our time.
00:00:06 --> 00:00:08 I want to personally thank you for listening to the podcast.
00:00:09 --> 00:00:12 If you like what you're hearing, then I need you to do a few things.
00:00:13 --> 00:00:19 First, I need subscribers. I'm on Patreon at patreon.com slash amomentwitherikfleming.
00:00:19 --> 00:00:24 Your subscription allows an independent podcaster like me the freedom to speak
00:00:24 --> 00:00:27 truth to power, and to expand and improve the show.
00:00:28 --> 00:00:32 Second, leave a five-star review for the podcast on the streaming service you
00:00:32 --> 00:00:35 listen to it. That will help the podcast tremendously.
00:00:36 --> 00:00:41 Third, go to the website, momenteric.com. There you can subscribe to the podcast,
00:00:42 --> 00:00:47 leave reviews and comments, listen to past episodes, and even learn a little bit about your host.
00:00:47 --> 00:00:51 Lastly, don't keep this a secret like it's your own personal guilty pleasure.
00:00:52 --> 00:00:56 Tell someone else about the podcast. Encourage others to listen to the podcast
00:00:56 --> 00:01:02 and share the podcast on your social media platforms, because it is time to
00:01:02 --> 00:01:04 make this moment a movement.
00:01:04 --> 00:01:10 Thanks in advance for supporting the podcast of our time. I hope you enjoy this episode as well.
00:01:11 --> 00:01:16 The following program is hosted by the NBG Podcast Network.
00:01:21 --> 00:01:56 Music.
00:01:56 --> 00:02:01 Hello. Welcome to another Moment with Erik Fleming. I am your host, Erik Fleming.
00:02:02 --> 00:02:13 And today I have two guests who are very distinctive in their own way.
00:02:14 --> 00:02:17 They're both activists in their own way.
00:02:18 --> 00:02:22 And it'll really be,
00:02:23 --> 00:02:33 Well, it is fascinating to talk to people like this and get their perspectives on the world.
00:02:33 --> 00:02:44 One is a very multi-talented individual in the world of entertainment and hosts
00:02:44 --> 00:02:46 her own podcasts, all that kind of stuff.
00:02:47 --> 00:02:52 And how she uses her talents to be an activist.
00:02:52 --> 00:02:59 And then my other guest is an individual that has, for years,
00:03:00 --> 00:03:08 really focused in on one particular issue and has been trying to be the voice of reason.
00:03:10 --> 00:03:19 And really be an activist to do the right thing in that one particular realm that he's focused on.
00:03:19 --> 00:03:28 So I hope that you enjoy these guests and enjoy, respect their perspective and
00:03:28 --> 00:03:31 just enjoy the conversation we were able to have.
00:03:32 --> 00:03:36 You know, 30 minutes is a long time to talk to somebody, but,
00:03:36 --> 00:03:41 you know, the one thing about the guests that I have is that you really want
00:03:41 --> 00:03:44 to talk to them for as long as you can.
00:03:44 --> 00:03:48 We just try to stay within a certain time frame.
00:03:48 --> 00:03:54 But anyway, I hope that y'all will enjoy listening to them.
00:03:54 --> 00:03:59 A couple of housekeeping notes. One, last week was Grace Gee's birthday.
00:04:00 --> 00:04:06 So allow me, on behalf of y'all listeners, to wish her a happy belated birthday.
00:04:06 --> 00:04:11 And we're still trying to get 20 subscribers on Patreon.
00:04:11 --> 00:04:16 So you can go to patreon.com slash amomentwitherikfleming and go ahead and do that.
00:04:17 --> 00:04:20 Some weeks I'll remember to say it, others I won't.
00:04:20 --> 00:04:23 But just know that that's ongoing until we get it.
00:04:23 --> 00:04:28 And I will definitely let you know when we reach that threshold.
00:04:29 --> 00:04:38 Just for the importance of keeping this podcast independent and unencumbered, right?
00:04:40 --> 00:04:45 So, you know, as we've learned with this administration,
00:04:45 --> 00:04:47 a lot of stuff is going on.
00:04:48 --> 00:04:54 And, you know, we do our best, between me and
00:04:55 --> 00:05:00 Grace, to try to keep you informed about what's happening.
00:05:00 --> 00:05:04 So let's go ahead and get this program started. And as always,
00:05:04 --> 00:05:08 we kick it off with a moment of news with Grace Gee.
00:05:10 --> 00:05:15 Music.
00:05:15 --> 00:05:20 Thanks, Erik. President Trump ordered federal agencies to grant the military
00:05:20 --> 00:05:24 authority over borderlands to build walls and install security infrastructure
00:05:24 --> 00:05:26 under a national emergency declaration.
00:05:27 --> 00:05:31 A suspect was arrested for a shooting at Florida State University in Tallahassee,
00:05:31 --> 00:05:34 which left one person dead and six injured.
00:05:34 --> 00:05:39 The Trump administration threatened to revoke Harvard's tax-exempt status over
00:05:39 --> 00:05:43 alleged anti-Semitism and froze $2 billion in federal funding.
00:05:43 --> 00:05:48 A suspect is in custody after Pennsylvania Governor Josh Shapiro's residence
00:05:48 --> 00:05:50 was damaged in an arson attack.
00:05:51 --> 00:05:58 El Salvador's president, Bukele, refused to repatriate Kilmar Abrego Garcia, disregarding a U.S.
00:05:58 --> 00:06:01 Supreme Court order directed at the Trump administration.
00:06:01 --> 00:06:06 Salvadoran authorities allowed U.S. Senator Chris Van Hollen to meet with Abrego
00:06:06 --> 00:06:07 Garcia while visiting the country.
00:06:08 --> 00:06:13 A judge ruled Palestinian activist Mahmoud Khalil deportable under Trump's policies,
00:06:13 --> 00:06:17 denying his request to subpoena Secretary of State Marco Rubio.
00:06:17 --> 00:06:22 Two New York prison guards were charged with murder and eight others with related
00:06:22 --> 00:06:26 crimes for fatally beating an unarmed inmate, Messiah Nantwi,
00:06:26 --> 00:06:29 at Mid-State Correctional Facility in March.
00:06:29 --> 00:06:34 The NAACP sued the U.S. Education Department to block funding cuts targeting
00:06:34 --> 00:06:38 diversity programs for black students. The U.S.
00:06:38 --> 00:06:43 Military and Air Force academies will end race-conscious admissions policies
00:06:43 --> 00:06:46 following Trump administration directives. U.S.
00:06:47 --> 00:06:51 Prosecutors are reviewing a case involving a former FBI informant who admitted
00:06:51 --> 00:06:56 to fabricating bribery claims against Joe and Hunter Biden, seeking his release during an appeal.
00:06:56 --> 00:07:01 A federal judge rejected a challenge to immigration enforcement in places of worship.
00:07:01 --> 00:07:06 A Los Angeles judge approved resentencing hearings for the Menendez brothers,
00:07:06 --> 00:07:12 potentially making them eligible for parole after 35 years in prison for murdering their parents.
00:07:13 --> 00:07:18 And measles cases in Texas and New Mexico rose to 624.
00:07:18 --> 00:07:22 I am Grace Gee, and this has been a Moment of News.
00:07:24 --> 00:07:29 Music.
00:07:29 --> 00:07:32 All right. Thank you, Grace, for that moment of news.
00:07:32 --> 00:07:38 And now it is time for our guest, Denise Love Hewett.
00:07:39 --> 00:07:45 Denise Love Hewett is the host of the podcast, Do the Work, which focuses on
00:07:45 --> 00:07:48 redefining leadership to be more holistic.
00:07:48 --> 00:07:53 This was birthed out of her time as a founder of Scripted, an entertainment
00:07:53 --> 00:07:58 tech company working to dismantle the inefficient and exclusionary practices of Hollywood.
00:07:58 --> 00:08:03 She comes from a world of entertainment, fashion, and hospitality,
00:08:04 --> 00:08:09 starting her career as a director of sales and marketing at famed nightclub
00:08:09 --> 00:08:15 The Box and working for cultural trailblazers like Patricia Field of Sex and
00:08:15 --> 00:08:17 the City and Ugly Betty fame,
00:08:17 --> 00:08:20 Marvin Jarrett from Nylon Magazine,
00:08:20 --> 00:08:26 Tyra Banks, Courtney Love, Simon Hammerstein, and Randy Wiener before creating
00:08:26 --> 00:08:29 content at MTV and Endable.
00:08:29 --> 00:08:34 She is a seasoned television and digital producer who lives at the intersection
00:08:34 --> 00:08:37 of activism, entrepreneurship, and entertainment.
00:08:37 --> 00:08:41 She is also a professional DJ who has worked with Vanity Fair,
00:08:42 --> 00:08:50 Gucci, Shiseido, Sony, Glow Recipe, Oprah, Hilary Duff, among many others.
00:08:50 --> 00:08:56 She holds a bachelor's degree from Gallatin at NYU, where she studied cultural
00:08:56 --> 00:09:00 signifiers and how they affect and reflect society.
00:09:00 --> 00:09:05 And she has just started a brand new podcast called Too Much.
00:09:05 --> 00:09:09 Ladies and gentlemen, it is my distinct honor and privilege to have as a guest
00:09:09 --> 00:09:14 on this podcast, Denise Love Hewett.
00:09:15 --> 00:09:24 Music.
00:09:24 --> 00:09:29 All right. Denise Love Hewett. How are you doing, ma'am? You doing good? Hi.
00:09:30 --> 00:09:34 I'm doing as good as one can be in the times that we're living in.
00:09:35 --> 00:09:37 Well, that's definitely understandable.
00:09:37 --> 00:09:41 And like I tell people all the time, this podcast is my therapy.
00:09:41 --> 00:09:43 So welcome to the session.
00:09:43 --> 00:09:46 I greatly appreciate you participating in that.
00:09:47 --> 00:09:49 Checking in. I love therapy. Yeah.
00:09:50 --> 00:09:56 So, the way I do my thing, I've listened to your podcast.
00:09:56 --> 00:10:01 The way I do my thing is I do some icebreakers at the beginning.
00:10:01 --> 00:10:05 So the first icebreaker is a quote.
00:10:05 --> 00:10:10 Do the best you can until you know better. Then when you know better,
00:10:10 --> 00:10:13 do better. What does that quote mean?
00:10:14 --> 00:10:20 I think that's Oprah's number one phrase. I think it was Maya Angelou
00:10:20 --> 00:10:21 who originally said it to her.
00:10:22 --> 00:10:26 But, you know, I think that we don't always know what we don't know.
00:10:27 --> 00:10:30 So a lot of times, sometimes when people are behaving a certain way,
00:10:30 --> 00:10:34 it's because they're maybe not aware of how they're behaving.
00:10:34 --> 00:10:36 And then when they get the awareness, they can change that behavior.
00:10:36 --> 00:10:43 And so for me, that's the goal: make people aware so we can all do better.
00:10:43 --> 00:10:47 Most people want to. And when you do know better, you do do better.
00:10:47 --> 00:10:53 Yeah. All right. So the next icebreaker, I need you to pick a number between 1 and 20.
00:10:53 --> 00:10:59 12. Okay. What advice do you have for recognizing fake news,
00:11:00 --> 00:11:03 propaganda, misinformation, disinformation, whatever?
00:11:04 --> 00:11:08 You don't have to be able to recognize it. You have to be able to be vigilant to check it.
00:11:08 --> 00:11:13 The biggest thing is fact-checking and using Google to check your sources.
00:11:14 --> 00:11:18 And recently in my family thread, my mother sent this thing in our text chain.
00:11:19 --> 00:11:23 And I was like, this feels like conspiracy theory to me. Can you please tell
00:11:23 --> 00:11:24 me like where this came from?
00:11:24 --> 00:11:26 And then she's like, oh, well, I haven't fact-checked it yet.
00:11:26 --> 00:11:29 And I said, okay, well, I'm making a rule for this family chat.
00:11:29 --> 00:11:33 It can only be things in the family chat that you have checked before you put
00:11:33 --> 00:11:36 in here. I don't want to go down the tunnels of things that aren't true.
00:11:36 --> 00:11:41 We have to be really vigilant because that is one of the main ways we keep calm
00:11:41 --> 00:11:44 and keep control in this time is to check.
00:11:44 --> 00:11:48 So you don't have to necessarily be able to recognize it. You just have to do your job to check it.
00:11:49 --> 00:11:54 Yeah. Yeah. And that's very important, you know, because a lot of people just,
00:11:54 --> 00:11:57 you know, we get into these silos, right?
00:11:57 --> 00:12:02 And, you know, we want to, I'm old enough to remember Walter Cronkite.
00:12:02 --> 00:12:06 I don't know if, I don't think you are, but I know who Walter Cronkite is,
00:12:06 --> 00:12:11 but I mean, it was like, he was the person that we trusted. So if he told us
00:12:11 --> 00:12:13 this happened in the world and that happened. Right.
00:12:14 --> 00:12:18 And, you know, we don't, we don't have any Walter Cronkites anymore,
00:12:18 --> 00:12:19 or even a Peter Jennings.
00:12:19 --> 00:12:24 I mean, it's just, you know, so, but people still have that desire to trust
00:12:24 --> 00:12:30 the person that they're listening to, whether it's on a podcast, cable news, whatever.
00:12:31 --> 00:12:35 And, you know, people have taken advantage of that. So I think,
00:12:35 --> 00:12:36 you know, it's just as important.
00:12:37 --> 00:12:43 You know, I remember a pastor once saying, don't just take my word for it in
00:12:43 --> 00:12:44 a sermon, read the book yourself.
00:12:45 --> 00:12:50 And I think that's kind of the mantra we need to have in this political dialogue.
00:12:53 --> 00:12:56 I was going to ask you another question, but I'm going to skip that and get right into this
00:12:56 --> 00:13:01 one. What is it like being a creative in this political atmosphere we're in?
00:13:02 --> 00:13:08 Creatives thrive in environments where they have meaningful things to say and work to give.
00:13:08 --> 00:13:18 And that is the unfortunate truth, is that we are needed most in times of tension and frustration.
00:13:18 --> 00:13:22 Because people are looking for release, they're looking for support,
00:13:22 --> 00:13:29 and creatives have that ability to contextualize the world we live in, in a very heartfelt way.
00:13:29 --> 00:13:33 And so that's what's the weird part about living in a time like this is that
00:13:33 --> 00:13:39 you never feel more called to action, but it's very sad for the creative to
00:13:39 --> 00:13:42 feel like I want to be of service in this really hard time,
00:13:42 --> 00:13:45 but I wish this time didn't have to be so hard.
00:13:46 --> 00:13:54 Yeah. Yeah. So as far as you go, right, do you feel that...
00:13:55 --> 00:14:02 it's more of a service you're providing in this time, or that it's more of
00:14:02 --> 00:14:07 an opportunity to be more creative and more expressive?
00:14:08 --> 00:14:12 My ultimate goal as an artist is to always be of service. I believe that we
00:14:12 --> 00:14:17 all have a calling we're brought here for and our ultimate job while we're here
00:14:17 --> 00:14:22 is trying to figure out how to make that call to be of service align with
00:14:22 --> 00:14:25 financial reality.
00:14:25 --> 00:14:29 And so everything I do, I look at it from that angle.
00:14:29 --> 00:14:33 How can I be of the most service with the gifts that I have?
00:14:33 --> 00:14:37 And so I've often struggled as a creative, because at times when I've had to
00:14:37 --> 00:14:40 do jobs for money that are not soul-fulfilling or don't feel like they're part of
00:14:40 --> 00:14:44 my path, it feels like a waste of what I'm here to do. And in the times when
00:14:44 --> 00:14:47 I'm living fully in service, I feel the most whole.
00:14:48 --> 00:14:53 And so that's how I look at it. It's not necessarily binary.
00:14:53 --> 00:14:55 I think that for me, they go hand in hand.
00:14:57 --> 00:15:01 What problem do you have with pattern matching?
00:15:01 --> 00:15:06 You seem to have gotten famous for saying, I've got a problem with pattern matching.
00:15:06 --> 00:15:10 What is your problem with pattern matching? So historically,
00:15:11 --> 00:15:17 if society, especially American society, has predominantly been run and led
00:15:17 --> 00:15:21 by white men, then the pattern and data that we have in this country reflects that.
00:15:21 --> 00:15:27 So when you're raising capital or pitching a film script or starting a company,
00:15:27 --> 00:15:30 people are looking at data that isn't fair.
00:15:30 --> 00:15:34 So my issue is that when you look at that level of data and you're trying to
00:15:34 --> 00:15:39 de-risk your investment choice or your creative choice, you're actually hindering
00:15:39 --> 00:15:42 someone that doesn't fit that pattern.
00:15:42 --> 00:15:45 You're limiting the level of innovation and creativity we can see in the world
00:15:45 --> 00:15:48 because I am never going to fit that pattern.
00:15:48 --> 00:15:51 That pattern wasn't built for me. It wasn't built for people of color.
00:15:51 --> 00:15:56 It's not something we can use as a metric to de-risk anything.
00:15:56 --> 00:15:59 So, for example, in Hollywood, which is my background...
00:16:00 --> 00:16:04 International films have historically had a white male lead.
00:16:04 --> 00:16:09 So when we started diversifying those leads, like Black Panther is a great example,
00:16:09 --> 00:16:15 there was a lot of feedback around, well, this won't sell internationally,
00:16:15 --> 00:16:17 when internationally there are more people of color than white people.
00:16:17 --> 00:16:20 So it's also like a flawed logical argument.
00:16:21 --> 00:16:24 But the data doesn't support that because we've never done it.
00:16:24 --> 00:16:28 So relying on that data to say it's not going to sell internationally is a fallacy.
00:16:28 --> 00:16:31 And you won't know until you do it. And then, of course, what did Black Panther
00:16:31 --> 00:16:33 do? It blew everyone's expectations away.
00:16:34 --> 00:16:42 And so my point is that if we're going to be innovating, you can't let the data hold you back.
00:16:42 --> 00:16:47 And the pattern that exists is not the pattern that we need in today's culture.
00:16:48 --> 00:16:53 And innovation doesn't come from patterns. So that's my major problem with the
00:16:53 --> 00:16:57 system, is that it's not a reliable success metric or indicator.
00:16:57 --> 00:17:02 So people have called you a pioneer for speaking out on that.
00:17:02 --> 00:17:07 Do you feel that you're a pioneer, or do you just feel that you were given
00:17:07 --> 00:17:12 a platform to vocalize something that others have been trying to work on?
00:17:12 --> 00:17:16 I'm in a lot of good company for sure. I'm definitely not alone in speaking
00:17:16 --> 00:17:19 truth to power in terms of inequity.
00:17:19 --> 00:17:22 There's plenty of people that I have learned from that have paved the way for me.
00:17:23 --> 00:17:27 Specifically in my story, I was building a tech startup at the time and I was
00:17:27 --> 00:17:34 raising capital and facing a deeper level of inequity in venture capital while
00:17:34 --> 00:17:37 I was trying to build this company to create more equity in Hollywood.
00:17:38 --> 00:17:42 So it was this really like weird mirror moment in which I'm trying to solve for inclusion here.
00:17:42 --> 00:17:46 And yet I'm facing that same problem in my own life, trying to make this company happen.
00:17:47 --> 00:17:51 And through that, there was a lot of mixed feedback. Venture capital,
00:17:51 --> 00:17:56 just for the audience: 2% of venture capital goes to women, which is an insane
00:17:56 --> 00:17:59 statistic, and even less goes to people of color.
00:18:00 --> 00:18:02 So, like, 1% for people of color. So still to this day,
00:18:03 --> 00:18:07 the number one mechanism for building generational wealth is biased towards one type of person.
00:18:07 --> 00:18:10 Yet women are 70% of the purchasing power in this country.
00:18:11 --> 00:18:14 Beyond that, women get higher returns on capital invested. So they're actually
00:18:14 --> 00:18:17 a better investment, yet they get less capital.
00:18:17 --> 00:18:19 So all these things, I was looking at all this information and saying,
00:18:19 --> 00:18:22 none of this makes sense. And it wasn't even about my company.
00:18:22 --> 00:18:27 It was the fact that I was meeting, I was also like coaching a lot of entrepreneurs.
00:18:27 --> 00:18:33 So during COVID, I was on Clubhouse and I met an amazing, amazing single woman,
00:18:33 --> 00:18:35 single mother who was building a smart car seat.
00:18:36 --> 00:18:39 And we had talked about it. She had put a patent in and I was like,
00:18:39 --> 00:18:42 well, are you talking to investors? I was like, have you made an investment deck?
00:18:42 --> 00:18:44 And she was like, well, what's an investment deck? And I was like,
00:18:44 --> 00:18:47 of course, you wouldn't know about an investment deck because why would you?
00:18:47 --> 00:18:50 Like, it's a weird language we've created for people to create these barriers
00:18:50 --> 00:18:52 to entry for people to participate.
00:18:53 --> 00:18:56 And there's all this brilliant innovation all around us, people who are true
00:18:56 --> 00:18:59 entrepreneurs, who are scrappy. This mom was working two jobs on the weekends.
00:18:59 --> 00:19:01 She's working on the smart car seat.
00:19:01 --> 00:19:06 She was unbelievable and would not have been taken seriously by this institution
00:19:06 --> 00:19:08 because she didn't speak the same language.
00:19:09 --> 00:19:12 And I just was infuriated. I
00:19:12 --> 00:19:15 was infuriated that we were leaving beautiful brilliance on
00:19:15 --> 00:19:18 the table. I was infuriated that women weren't
00:19:18 --> 00:19:21 allowed to compete at the same level. And so I just
00:19:21 --> 00:19:28 had to speak, knowing that in some ways I
00:19:28 --> 00:19:31 was falling on the sword for my company, let's just put it that way. So a lot
00:19:31 --> 00:19:34 of people, a lot of investors, said, why would you talk about this? Entrepreneurs
00:19:34 --> 00:19:38 and investors: why would you speak about this? You're just going to look like
00:19:38 --> 00:19:39 a depreciated asset in the market.
00:19:40 --> 00:19:43 Everyone's just going to think you can't raise money. Every founder has a hard
00:19:43 --> 00:19:44 time. You just look like you're whining.
00:19:45 --> 00:19:49 And I said, I really don't care. The reality is the data speaks for itself.
00:19:49 --> 00:19:53 And I'm not whining. I don't give a shit about my company at this point.
00:19:53 --> 00:19:57 I care about like hopefully leveling the playing field.
00:19:57 --> 00:20:03 So moving forward, other founders have more of an opportunity.
00:20:03 --> 00:20:06 That's really all I give a shit about. Because that's what my company was all
00:20:06 --> 00:20:08 about, was creating more opportunity.
00:20:08 --> 00:20:14 You once said that we have no control over how our work is perceived or supported.
00:20:14 --> 00:20:19 We can only make the thing we are called to make.
00:20:19 --> 00:20:22 Why was that an important lesson for you to learn?
00:20:23 --> 00:20:28 Because in a world, and specifically in a capitalistic country, that really
00:20:28 --> 00:20:30 values external validation,
00:20:30 --> 00:20:37 it's hard as an artist to feel like your work matters or means something if
00:20:37 --> 00:20:42 it's not tied to external financial success or visibility.
00:20:43 --> 00:20:46 I've done a lot of different creative experiments along the way
00:20:46 --> 00:20:49 that I feel like were, you know,
00:20:49 --> 00:20:52 just in a silo that not many people
00:20:52 --> 00:20:55 engaged with, or that I thought were really meaningful
00:20:55 --> 00:20:58 but only engaged, you know, the right few. But I
00:20:58 --> 00:21:02 think that impact work is different, and
00:21:02 --> 00:21:05 that's really what I've come to post-company. I was
00:21:05 --> 00:21:08 talking to someone on my podcast, her episode will come out in a
00:21:08 --> 00:21:11 couple weeks, but she has created huge
00:21:11 --> 00:21:15 strides in pay equity for women's professional
00:21:15 --> 00:21:18 athletics, and the whole time she was doing that, she was barely making money,
00:21:18 --> 00:21:21 she was getting by, yet she created this monumental shift
00:21:21 --> 00:21:24 for so many people. And to me, that is the true success.
00:21:24 --> 00:21:27 And in the world we live in, we have to reframe it
00:21:27 --> 00:21:33 for ourselves, because I think it's so hard to feel like it matters or you're
00:21:33 --> 00:21:38 doing something meaningful if you don't have that immediate feedback loop.
00:21:38 --> 00:21:43 Yeah. You know what? When I was putting this question together, I was thinking about
00:21:43 --> 00:21:45 Alanis Morissette, right?
00:21:46 --> 00:21:49 Because Alanis Morissette has been doing music for years.
00:21:50 --> 00:21:54 Then she had this one monster album, right?
00:21:55 --> 00:22:02 And ever since then, she's made other albums, but they haven't been as successful,
00:22:02 --> 00:22:05 but she's still doing what she wants to do.
00:22:06 --> 00:22:13 I think that's something that all people that are creatives have to have to balance and juggle.
00:22:13 --> 00:22:16 And, you know, I look at her as a perfect example. She doesn't care.
00:22:17 --> 00:22:19 She's going to do what she wants to do.
00:22:19 --> 00:22:26 And, you know, just that one moment in time, everybody got it and bought the album.
00:22:26 --> 00:22:34 But, you know, her whole intent was to produce the music she wanted to produce
00:22:34 --> 00:22:38 and stand on what she truly is.
00:22:38 --> 00:22:43 And I think that's something that is missing in politics.
00:22:43 --> 00:22:50 I think in politics, everybody's trying to find that magic formula,
00:22:50 --> 00:22:55 that magic catchphrase to get them elected, but you know, they're not being
00:22:55 --> 00:22:57 genuine and not being themselves.
00:22:57 --> 00:23:01 And, and that turns people off. Do you feel the same way?
00:23:02 --> 00:23:05 Yeah, I think Twyla Tharp, who's a very famous choreographer,
00:23:05 --> 00:23:09 wrote this book called The Creative Habit. And she's been a choreographer for 70 years.
00:23:10 --> 00:23:13 And she talked about failures she had, successes she had.
00:23:13 --> 00:23:17 And she's like, being an artist is focusing on the process. Being a creative
00:23:17 --> 00:23:19 is process-oriented, not outcome-oriented.
00:23:19 --> 00:23:25 And in a span of 70 years, she's had multiple massive successes that have connected in huge ways.
00:23:25 --> 00:23:27 But knowing that failure was a part
00:23:27 --> 00:23:32 of it. And so that really helped me contextualize how I wanted to work.
00:23:32 --> 00:23:38 And when we talk about politics, what is clear to me in this last go-round was
00:23:38 --> 00:23:45 that Tim Waltz struck a chord with a lot of people for the reason that he's just a nice,
00:23:45 --> 00:23:47 normal guy. He's not in it for fame.
00:23:48 --> 00:23:51 He's not in it for money. He's in it for impact.
00:23:52 --> 00:23:56 And that is the purest, truest way to show up.
00:23:57 --> 00:24:00 Because people can see when you're in it for the fame or the money or whatever
00:24:00 --> 00:24:01 else that politics has become.
00:24:02 --> 00:24:06 And I think we just want to get back to a place where the people that are in
00:24:06 --> 00:24:07 power care about the people.
00:24:08 --> 00:24:11 Yeah. Yeah. Bottom line. All right.
00:24:11 --> 00:24:17 So you also said that great leadership stays at the intersection of inner work
00:24:17 --> 00:24:19 and professional success.
00:24:20 --> 00:24:24 Why is inner work such an integral part of leadership, in your opinion?
00:24:25 --> 00:24:31 I do not think you can be a great leader and a holistic leader without doing the internal inquiry.
00:24:31 --> 00:24:35 This is a big theme. I'm currently working on a book about this.
00:24:36 --> 00:24:42 I believe that your personal and your professional worlds are inherently linked. They are not binary.
00:24:42 --> 00:24:46 And in the world that we have grown up in, we are told they are separate things.
00:24:46 --> 00:24:49 But whatever is happening in your personal life is happening in your professional
00:24:49 --> 00:24:51 life. So, for example, say you're a terrible communicator.
00:24:51 --> 00:24:56 You don't, like, text your partner all the time. I bet you're not communicating
00:24:56 --> 00:25:01 well at work either. My other friend told me recently about back when he started his career.
00:25:01 --> 00:25:05 He's a cultural strategist. His job is to think. And he's like, I live
00:25:05 --> 00:25:10 in my head. When I started dating someone, I couldn't live in my head anymore,
00:25:10 --> 00:25:14 because then they don't know what's going on. You have to learn to change
00:25:14 --> 00:25:17 that. The same thing he was doing at work, which was thinking a
00:25:17 --> 00:25:22 lot, wasn't necessarily communicating what he was thinking. And so you create
00:25:22 --> 00:25:24 tension in both parts of your life.
00:25:24 --> 00:25:27 And when you do the internal inquiry, you're able to alleviate that friction
00:25:27 --> 00:25:33 and tension and show up as a thoughtful, empathetic person.
00:25:33 --> 00:25:37 And leadership needs that: great managers are thoughtful and empathetic and present.
00:25:38 --> 00:25:43 And that's harder and harder to do in a social media addicted world, in a work
00:25:43 --> 00:25:47 addicted world. But I think there's nothing more valuable than taking the time
00:25:47 --> 00:25:51 to try and be the best version of yourself, to try and show up in the
00:25:51 --> 00:25:55 way that you want to. And I've seen a radical shift in
00:25:55 --> 00:26:00 my work and my relationship to work since I started working on myself. And every
00:26:00 --> 00:26:06 leader that I know that is happy and has a lot of money has done this.
00:26:07 --> 00:26:15 When you talk about inner work, a lot of that is connected to spirituality.
00:26:15 --> 00:26:19 And you say this doesn't necessarily have to be religion, and I agree with that.
00:26:19 --> 00:26:24 I think that you have to have some kind of grounding, some kind of foundation
00:26:24 --> 00:26:28 in order to, to move forward.
00:26:28 --> 00:26:33 If you don't love yourself, you can't love anybody else. And like I said in
00:26:33 --> 00:26:40 another podcast, if you don't have a love for the people, then you can't lead them.
00:26:41 --> 00:26:49 So I think that's vitally important to have that connection within yourself.
00:26:50 --> 00:26:58 I just, you know, because I've been through depression, so I know how that detachment feels, right?
00:26:59 --> 00:27:07 And, you know, I tell people I'm recovering from depression because I know how
00:27:07 --> 00:27:15 I was before and how I am now is not the same as I was before I had to deal with depression.
00:27:16 --> 00:27:25 But I'm better than what I was, you know, and, you know, you can see the difference.
00:27:25 --> 00:27:32 But if I wasn't in tune with myself, then there ain't no telling where I would be right now.
00:27:32 --> 00:27:36 So I think that's vitally important for leadership.
00:27:36 --> 00:27:42 And so, you know, most of your focus in leadership has been in the entertainment industry.
00:27:42 --> 00:27:48 So I'm going to ask you to expand that and ask, how do we redefine leadership
00:27:48 --> 00:27:53 in America to reflect the America we live in?
00:27:54 --> 00:27:59 The leadership we have currently in this country reflects the values that we have upheld.
00:27:59 --> 00:28:05 And so what I want to keep driving home for people is if you want a compassionate,
00:28:05 --> 00:28:11 empathetic leader who prioritizes your well-being, then we have to start prioritizing other things.
00:28:12 --> 00:28:17 We cannot prioritize money, power, and fame as our gods and then expect to get
00:28:17 --> 00:28:18 a different type of leader.
00:28:18 --> 00:28:21 We got exactly the shadow side
00:28:21 --> 00:28:24 of America in a human being now
00:28:24 --> 00:28:28 in the highest position in this country. And
00:28:28 --> 00:28:35 so as much as it breaks my heart every day, and I have a really hard time trying
00:28:35 --> 00:28:39 to contextualize how this happened, there's a part of me that also is very clear
00:28:39 --> 00:28:43 how this happened, right? So that's my biggest thing: we have to start with
00:28:43 --> 00:28:45 our own little corner of the world.
00:28:45 --> 00:28:51 So you and I on this podcast, we are starting with our local community,
00:28:51 --> 00:28:54 and then hopefully that reverberates, and then hopefully that creates change.
00:28:54 --> 00:28:58 But I can't control everything. I can control how I show up,
00:28:58 --> 00:29:02 how I treat people, how I support people, how I inspire people.
00:29:03 --> 00:29:09 And that's, I think, the best place to start to get closer to
00:29:09 --> 00:29:12 the promise of what the Constitution promised us, okay?
00:29:12 --> 00:29:15 We know it has never lived fully in practice, but there's a promise there that
00:29:15 --> 00:29:18 we can live into if we choose to,
00:29:18 --> 00:29:23 but we have to be able to be more community-minded and think a little less about
00:29:23 --> 00:29:28 ourselves and our own financial well-being 24-7.
00:29:29 --> 00:29:35 Yeah, and you know, that reminds me of what Shakespeare said,
00:29:35 --> 00:29:41 and I guess it was Hamlet or whatever, when it talks about "to thine own self be true," right?
00:29:42 --> 00:29:46 Because in America, whenever something really crazy happens,
00:29:46 --> 00:29:51 we always say, well, that's not America. That's not who we are.
00:29:53 --> 00:29:59 But from what I hear you saying, it's like, actually, it is who we are. And we need to fix that.
00:29:59 --> 00:30:02 There's two competing narratives, right? There's a narrative they told us,
00:30:02 --> 00:30:04 which was that all men are created equal.
00:30:04 --> 00:30:07 Well, that was written while those people owned other people.
00:30:07 --> 00:30:13 So we know that in the birth of this country, there was misalignment from day one.
00:30:13 --> 00:30:19 So you can't expect to get to an alignment if it was birthed upon poison soil.
00:30:19 --> 00:30:25 So we now have to figure out how to realign, get to a place where we're living
00:30:25 --> 00:30:29 the values that were promised, not the values that were happening in actuality.
00:30:30 --> 00:30:33 And that's going to take some work, right? That's the work we're in right now,
00:30:33 --> 00:30:39 is to overthrow the toxic narrative that is racist, sexist, xenophobic,
00:30:39 --> 00:30:43 and greedy, and oligarchical.
00:30:43 --> 00:30:46 Like, that's not the world, the story
00:30:46 --> 00:30:49 we were told, but that is what happens when you don't address
00:30:49 --> 00:30:53 the wrongdoings that you were born upon. Yeah,
00:30:53 --> 00:30:56 and see, I've always made the
00:30:56 --> 00:31:00 argument, since we're talking about spirituality, that
00:31:00 --> 00:31:03 when Thomas Jefferson wrote that, right,
00:31:03 --> 00:31:07 that was more divine intervention than
00:31:07 --> 00:31:11 it was him. Because, you
00:31:11 --> 00:31:15 know, he was the
00:31:15 --> 00:31:20 worst kind of slaveholder. I mean, he was having sex with the slaves, he didn't
00:31:20 --> 00:31:24 free them when he promised to free them, all that stuff. But he was the one that
00:31:24 --> 00:31:32 said that all men are created equal. He wrote that down. And so I think that a lot of times,
00:31:32 --> 00:31:35 you know, our aspirations
00:31:36 --> 00:31:44 fall short because we fall victim to, like you said, money, power, sex, whatever.
00:31:44 --> 00:31:47 We fall victim to our weakness.
00:31:48 --> 00:31:55 And if we don't do some self-reflecting, we can never get to our aspiration, right?
00:31:56 --> 00:32:01 Yeah. So, I mean, that's perfect how you brought that in.
00:32:01 --> 00:32:05 Do you still struggle with uncertainty and possibility?
00:32:06 --> 00:32:09 All the time. All the time. All
00:32:09 --> 00:32:12 the time. Being creative is just having such a
00:32:12 --> 00:32:15 deep amount of faith and conviction that you believe
00:32:15 --> 00:32:24 everything will work eventually. I'm a Virgo by birth, so that's my sun sign, and
00:32:24 --> 00:32:29 for astrology people, that's my South Node. The South Node is what you're
00:32:29 --> 00:32:34 born with and basically what you're learning to relinquish. So as a Virgo, I'm very Type A, a planner,
00:32:35 --> 00:32:37 love to know what's going on, love to plan ahead.
00:32:37 --> 00:32:42 My North Node is in Pisces, which is about dreaming, about uncertainty,
00:32:43 --> 00:32:44 letting go, being fluid.
00:32:44 --> 00:32:50 And so I am actively trying to get a lot more comfortable with the not knowing.
00:32:50 --> 00:32:54 I've come a long way, but I certainly am still not a master.
00:32:55 --> 00:33:01 And last year was like a year of deep, deep existential crisis for me. I
00:33:01 --> 00:33:05 have a lot more faith today, but about seven months ago, I was having a lot
00:33:05 --> 00:33:08 of daily talks with the universe about my life.
00:33:10 --> 00:33:15 Well, you know, it's like, I think if you are conscientious,
00:33:16 --> 00:33:18 then all of us struggle with that.
00:33:18 --> 00:33:22 I think, you know, when you're talking about
00:33:23 --> 00:33:28 this political atmosphere, there's definitely a lot of uncertainty
00:33:28 --> 00:33:35 at my end, as far as, like, are we even going to have an election in 2026?
00:33:35 --> 00:33:40 It's like, is my party going to rebound from this?
00:33:40 --> 00:33:43 Are they going to learn something from it? You know, all those kinds of things
00:33:43 --> 00:33:48 As a Black person, you know, there's always been uncertainty as far
00:33:48 --> 00:33:52 as when is the other shoe going to drop, and are they really going to do something to us?
00:33:52 --> 00:33:56 You know, they've been trying, but it's like, you know, we've always managed
00:33:56 --> 00:33:58 to find a way to fight back.
00:33:58 --> 00:34:04 And then that's where the possibility comes in, because we've seen over time
00:34:04 --> 00:34:09 that, you know, people who have been oppressed in this country have found a
00:34:09 --> 00:34:12 way to go forward and to make progress.
00:34:12 --> 00:34:17 And, you know, all of us are having children. So it's like, you know,
00:34:17 --> 00:34:24 with our children, we see the possibility, and we pray that there's a better
00:34:24 --> 00:34:26 world for them than what we're dealing with.
00:34:26 --> 00:34:31 So I think people, if they're if they're really honest, they all struggle with that.
00:34:31 --> 00:34:39 I think what's disingenuous to me is people in politics that don't acknowledge that.
00:34:40 --> 00:34:44 I think that, you know, there's some kind of, I don't know,
00:34:45 --> 00:34:49 persona that people feel they have to have, like, I have all the answers
00:34:49 --> 00:34:53 and, you know, everything's going to be fine if I get elected.
00:34:53 --> 00:34:57 But that's not really the truth. You know what I'm saying?
00:34:57 --> 00:35:03 And, you know, when things get hard, I think people are scared to be
00:35:03 --> 00:35:05 that genuine. Do you kind of feel that sense?
00:35:06 --> 00:35:12 Oh, yeah. I think truth telling is the most scary thing for people to do and receive.
00:35:12 --> 00:35:17 It's because it's not, you know, a sugar cube.
00:35:18 --> 00:35:21 It's something that may not be the answer you want to hear, and that can be uncomfortable.
00:35:22 --> 00:35:25 But we live in a time where people are not comfortable with discomfort,
00:35:25 --> 00:35:26 and they need to get more comfortable.
00:35:26 --> 00:35:31 That's part of one of the issues we have as a society, is our inability to be uncomfortable.
00:35:32 --> 00:35:38 And the one thing this presidency is doing is making me deeply uncomfortable every day.
00:35:38 --> 00:35:42 But I also have to be really present, right? Because every day we have such a chaotic leader.
00:35:43 --> 00:35:47 I don't know what's coming next. So you can't plan anything because it's so volatile.
00:35:48 --> 00:35:51 So you're just like waiting to see what's going to happen.
00:35:51 --> 00:35:55 Are we fully going to have a constitutional crisis in which then a lot of people
00:35:55 --> 00:35:57 will be deeply unsafe and harmed?
00:35:57 --> 00:36:00 And so then what do we do to get those people safe?
00:36:00 --> 00:36:04 But those are the things I'm looking at is like, when certain things happen,
00:36:04 --> 00:36:07 you know, it's time to go or, you know, other things have to kick in.
00:36:07 --> 00:36:11 And those are the things that I'm trying to track for, because I believe we
00:36:11 --> 00:36:13 have enough people on the right side of history in this country.
00:36:14 --> 00:36:21 I don't believe fascism will prevail, but I do think it's going to be a really uncomfortable
00:36:23 --> 00:36:28 way forward. And a lot of people are going to have to give up something,
00:36:29 --> 00:36:32 give up like some sort of security, some sort of comfort.
00:36:32 --> 00:36:36 And I don't think a lot of Americans have thought about that
00:36:36 --> 00:36:39 or are ready to do that. I just had a conversation with my mother about this, actually.
00:36:40 --> 00:36:43 And I've started to ask myself a lot of real questions, like,
00:36:43 --> 00:36:46 if we really do have to fight for this, what are you willing to give up?
00:36:46 --> 00:36:47 Are you willing to be imprisoned?
00:36:48 --> 00:36:51 What are the steps you're willing to take? And I want more people to ask themselves those questions.
00:36:51 --> 00:36:57 Yeah, I heard you say that on your podcast that, you know,
00:36:57 --> 00:37:01 you have to make a conscientious decision because you were saying everybody
00:37:01 --> 00:37:08 that you've admired that's been involved in what we call good trouble have had
00:37:08 --> 00:37:09 to have that experience.
00:37:10 --> 00:37:15 They've had to serve some time. And I volunteer in prison. So I'm in prison every
00:37:15 --> 00:37:16 three months volunteering.
00:37:16 --> 00:37:20 So I'm very aware of what that environment is and looks like,
00:37:20 --> 00:37:25 at least in California, which is pretty, we're a little more on the rehabilitative
00:37:25 --> 00:37:27 side, still a long way to go.
00:37:28 --> 00:37:32 But you have to ask those questions. And certainly in this environment,
00:37:32 --> 00:37:35 where he's sending people is not like a California prison.
00:37:36 --> 00:37:38 It's a concentration camp, essentially.
00:37:39 --> 00:37:44 So people need to start asking, because this administration
00:37:44 --> 00:37:49 leads by force, which is an unsustainable strategy, and I don't believe that
00:37:49 --> 00:37:51 they are going to go down quietly.
00:37:52 --> 00:37:57 So we really have to figure out, as people who care about protecting civil liberties,
00:37:58 --> 00:38:03 how we show up to save, you know, like there's a long way to go in America,
00:38:03 --> 00:38:06 but save at least the progress we've made thus far.
00:38:06 --> 00:38:10 All right. Who is the better futurist, you or Elon Musk and why?
00:38:11 --> 00:38:14 Elon Musk is not a futurist. Let's just clarify this real quick.
00:38:14 --> 00:38:18 That man is not a futurist. He is not innovating. He is not inventing.
00:38:18 --> 00:38:25 He is buying people's companies that have created a technology that then he can just profit upon.
00:38:25 --> 00:38:28 As we've seen through Doge, he is also a terrible leader.
00:38:28 --> 00:38:32 He's employed people that are young, sycophantic.
00:38:32 --> 00:38:37 So he has complete control and agency. That is not leadership.
00:38:37 --> 00:38:42 Leadership is collaborative. Leadership is inventive. And let's just talk about process.
00:38:42 --> 00:38:46 For example, you're going to come in, do no due diligence, eliminate things without
00:38:46 --> 00:38:50 weighing the pros and cons of the effects of what you're doing. Terrible leadership.
00:38:50 --> 00:38:54 You want to talk about efficiency and saving money. You need to make thoughtful
00:38:54 --> 00:38:56 decisions before you take action.
00:38:56 --> 00:39:01 And Silicon Valley has run on a "move fast, break things" mentality.
00:39:02 --> 00:39:06 Venture capital has supported these men in thinking that they are gods.
00:39:06 --> 00:39:09 They encourage them to lie. They encourage them to cheat.
00:39:09 --> 00:39:13 They encourage them to do whatever they have to do to get ahead and growth at all costs.
00:39:14 --> 00:39:19 I think we're seeing now the fallacy of venture capital. There's a reason so
00:39:19 --> 00:39:22 many founders have been taken down by the media.
00:39:22 --> 00:39:25 And when they get taken down, somehow I always get a call from a journalist
00:39:25 --> 00:39:27 because I'm one degree away.
00:39:27 --> 00:39:30 And the journalist will ask me questions and I say, you're not looking at the
00:39:30 --> 00:39:35 right thing. The problem is venture capital; founders are the symptom of this
00:39:35 --> 00:39:36 bad disease of venture capital.
00:39:36 --> 00:39:40 And so when you've been trained and rewarded in that, what does your brain do?
00:39:40 --> 00:39:44 Your brain then codes that you can get away with a lot of things.
00:39:44 --> 00:39:46 And when you're supported in that, you keep doing it.
00:39:46 --> 00:39:51 And so I don't think Elon Musk has done anything other than be given a lot
00:39:51 --> 00:39:53 of opportunity and a lot of grace to succeed.
00:39:54 --> 00:39:58 And he had the right people around him, the right time, and the right amount of money.
00:39:58 --> 00:40:03 There are so many people I know that, had they been given the same depth and grace
00:40:03 --> 00:40:07 of opportunity, would have done what he has done and more,
00:40:07 --> 00:40:11 but along the way, actually maybe solved world hunger.
00:40:12 --> 00:40:17 Right. I just, you know, and maybe I don't have the right mindset.
00:40:17 --> 00:40:20 That's probably why I wouldn't have that amount of money. But it seems to me
00:40:20 --> 00:40:26 that if the President of the United States said I could do anything I wanted
00:40:26 --> 00:40:29 to do and I was worth
00:40:30 --> 00:40:32 400 billion dollars. Yeah.
00:40:33 --> 00:40:37 The last thing I want to do is tinker with the federal government.
00:40:37 --> 00:40:42 I'd move to Puerto Rico. Hey, y'all need like a new grid?
00:40:42 --> 00:40:45 We can make that happen. We need to develop something.
00:40:45 --> 00:40:52 I would go someplace where it's like, you know, people need some help.
00:40:52 --> 00:40:55 It's like Jeff Bezos's ex. Yeah.
00:40:56 --> 00:40:59 Poor MacKenzie Scott. I say poor only
00:40:59 --> 00:41:02 in the struggle. I compare her struggle to
00:41:02 --> 00:41:05 Brewster's Millions. I don't know if you remember that movie with Richard Pryor,
00:41:05 --> 00:41:08 where he had to spend so much
00:41:08 --> 00:41:13 money in order to get more money, and everything he did, he was making money, so
00:41:13 --> 00:41:16 he wasn't losing any money. And that's kind of her situation. She's giving
00:41:16 --> 00:41:23 money away, but she's always making money every day based on, you know, her
00:41:23 --> 00:41:28 settlement. Yeah, so it's just a never-ending thing. But she's trying to do the right thing.
00:41:28 --> 00:41:33 And it would seem to me that somebody of his magnitude,
00:41:33 --> 00:41:39 you know, would have taken advantage of that. But I guess,
00:41:39 --> 00:41:43 again, it goes back to not doing the inner work.
00:41:44 --> 00:41:47 If any of these men had done the inner work,
00:41:47 --> 00:41:50 Donald Trump, Elon Musk, J.D.
00:41:50 --> 00:41:56 Vance, Peter Thiel... Whatever their wounding is, we're unfortunately reaping
00:41:56 --> 00:41:59 the consequences of it. Because if you are a content person,
00:41:59 --> 00:42:02 if you are a person who asks, like, the question of what is enough, right?
00:42:02 --> 00:42:05 Like how much more money do you need? Like what is enough?
00:42:06 --> 00:42:09 The fact that they're not asking those questions, the fact that they're operating
00:42:09 --> 00:42:15 out of ego, out of force shows that they're wounded because hurt people hurt people, right?
00:42:15 --> 00:42:20 People that care about other people don't look to wield harm upon them.
00:42:20 --> 00:42:21 They don't look at people as expendable.
00:42:22 --> 00:42:25 And these men look at humans as productive or unproductive.
00:42:26 --> 00:42:30 And that is not what humans were ever here to do. We're here to be and to be
00:42:30 --> 00:42:31 in community and to create.
00:42:32 --> 00:42:36 Capitalism was a thing we created. It's not what's in our nature necessarily.
00:42:36 --> 00:42:41 So it's all very bizarre to me. I mean, bizarre,
00:42:41 --> 00:42:42 not really bizarre, but it just
00:42:42 --> 00:42:49 makes me sad, because whatever pain they're feeling can be healed.
00:42:50 --> 00:42:55 Yeah. Yeah. If they so desire it.
00:42:55 --> 00:42:58 So let's end this on a more uplifting note.
00:43:01 --> 00:43:08 What gives you the biggest thrill: podcasting, DJing, or public speaking?
00:43:09 --> 00:43:13 They all give me something a little bit different, but it's all the same thing.
00:43:13 --> 00:43:16 The reason I love podcasting, I guess it's a little different,
00:43:16 --> 00:43:19 but public speaking and DJing, what I love about
00:43:19 --> 00:43:22 being live in front of people is there's this beautiful energetic
00:43:22 --> 00:43:26 exchange that happens, and there's nothing like performing
00:43:26 --> 00:43:29 live. There is no substitute for that
00:43:29 --> 00:43:32 feedback in real time, the magic
00:43:32 --> 00:43:35 that you can create, the level of flow state you can achieve in, like, a short
00:43:35 --> 00:43:43 amount of time. It is so, so joyful for me. Podcasting, for me, the joy of it is
00:43:43 --> 00:43:49 two things. One, surfacing all the brilliant humans that exist in this world and
00:43:50 --> 00:43:54 having a conversation with them that I hope will plant seeds for people of how
00:43:54 --> 00:43:56 to live the blueprint that makes the most sense for them.
00:43:57 --> 00:44:01 It really has restored my faith in humanity this year.
00:44:02 --> 00:44:06 There's so many great, amazing people who give a damn.
00:44:06 --> 00:44:09 And that makes me feel so full.
00:44:09 --> 00:44:13 Like podcasting really feeds my soul, having a really intimate,
00:44:13 --> 00:44:19 special conversation that then we can share with a lot of people and hopefully inspire them.
00:44:19 --> 00:44:22 Like, that's the goal: to give people the tools to feel like they have agency
00:44:22 --> 00:44:26 over their own lives, to feel like they aren't helpless, to feel like they can
00:44:26 --> 00:44:30 be the dreams that they have. That's what I want for all people.
00:44:31 --> 00:44:36 Amen. Amen. Because that is exactly the reason why I do what I do.
00:44:37 --> 00:44:43 You know, so it has been an honor to have that meaningful conversation with
00:44:43 --> 00:44:45 you, Denise Love Hewett.
00:44:47 --> 00:44:51 And, you know, like I tell every guest, you have an open invitation to come back.
00:44:52 --> 00:44:55 You know, if something's burning, you say, Erik, look, I got to get this off
00:44:55 --> 00:44:57 my chest. You know, we'll make that happen.
00:44:58 --> 00:45:03 So I want you to take advantage of that. But before we officially sign off,
00:45:04 --> 00:45:06 tell people how they can get in touch with you, how they can,
00:45:07 --> 00:45:11 you know, the name of the podcast that you do. Because you do too, right?
00:45:11 --> 00:45:16 Well, I've sunset Do the Work. So Do the Work was my first iteration,
00:45:16 --> 00:45:17 and it was about redefining leadership.
00:45:17 --> 00:45:23 And then I had a deep desire to expand that container to talk about new ways
00:45:23 --> 00:45:28 of redefining success, motherhood, romantic relationships, masculinity.
00:45:28 --> 00:45:30 I think we need a lot of new blueprints.
00:45:31 --> 00:45:34 The ones that have existed are not aging well. They've timed out.
00:45:35 --> 00:45:37 So I want to give people other pathways.
00:45:37 --> 00:45:40 So the new podcast is called Too Much with Denise Love Hewett.
00:45:41 --> 00:45:43 And that's what it's all about, redefining categories.
00:45:43 --> 00:45:49 And you can find me on TikTok, Instagram at Denise Love Hewett. It's E-T-T.
00:45:49 --> 00:45:54 And I would love to hear from you and feedback, advice, ideas. We're very open.
00:45:55 --> 00:46:00 Well, Denise, thank you again for coming on the podcast. I greatly appreciate you taking the time.
00:46:01 --> 00:46:05 I so appreciate this, Erik. It was such a joy to spend time with you.
00:46:05 --> 00:46:09 And thank you for all the good service you have done.
00:46:09 --> 00:46:12 All right, guys. And we're going to catch y'all on the other side.
00:46:12 --> 00:46:30 Music.
00:46:30 --> 00:46:38 All right, and we are back. And so now it is time for my next guest, Wendell Wallach.
00:46:39 --> 00:46:44 Wendell Wallach has an international reputation as an expert on the ethics and
00:46:44 --> 00:46:50 governance of emerging technologies, particularly artificial intelligence and biotechnologies.
00:46:50 --> 00:46:56 While semi-retired, he continues to consult and accept speaking engagements.
00:46:56 --> 00:47:04 From 2020 to 2024, he was the Carnegie-Uehiro Senior Fellow at the Carnegie Council
00:47:04 --> 00:47:06 for Ethics in International Affairs,
00:47:07 --> 00:47:16 where he founded and co-directed, with Anja Kaspersen, the AI and Equality Initiative.
00:47:16 --> 00:47:22 He is also senior advisor to the Hastings Center and a scholar at the Yale University
00:47:22 --> 00:47:24 Interdisciplinary Center for
00:47:24 --> 00:47:29 Bioethics, where he chaired technology and ethics studies for 11 years.
00:47:29 --> 00:47:34 Wallach's latest book, a primer on emerging technologies, is entitled A Dangerous
00:47:34 --> 00:47:38 Master: How to Keep Technology from Slipping Beyond Our Control.
00:47:38 --> 00:47:43 He co-authored, with Colin Allen, Moral Machines: Teaching Robots Right from
00:47:43 --> 00:47:49 Wrong. The World Technology Award for Ethics was awarded to Wendell in 2014,
00:47:49 --> 00:47:52 and for Journalism and Media in 2015.
00:47:53 --> 00:47:59 He was appointed Fulbright Research Chair at the University of Ottawa for 2015 to 2016.
00:48:00 --> 00:48:05 The World Economic Forum appointed Mr. Wallach co-chair of its Global Future
00:48:05 --> 00:48:10 Council on Technology Values and Policies for the 2016 to 2018 term.
00:48:11 --> 00:48:17 More recently, Wallach has been referred to as a godfather of AI ethics.
00:48:18 --> 00:48:22 Ladies and gentlemen, it is my distinct honor and privilege to have as a guest
00:48:22 --> 00:48:26 on this podcast, Wendell Wallach.
00:48:27 --> 00:48:37 Music.
00:48:37 --> 00:48:41 All right. Wendell Wallach. How are you doing, sir? Are you doing good?
00:48:42 --> 00:48:44 Doing fine. Thanks for having me.
00:48:45 --> 00:48:50 Now, did I pronounce your last name right? Because, you know, I'm a... You did good.
00:48:51 --> 00:48:56 Wallach. Okay. All right. Because, you know, I'm from the old school. You know, phonics.
00:48:57 --> 00:49:01 I had phonics when I was little. So I try to do phonetic stuff.
00:49:01 --> 00:49:07 Well, first of all... Yes, sir. Well, first of all, it's an honor to have you on.
00:49:07 --> 00:49:15 You are one of the more preeminent voices when it comes to ethics in modern
00:49:15 --> 00:49:17 or, as they say, emerging technologies.
00:49:18 --> 00:49:23 And so for you to come on the podcast is really, really an honor for me.
00:49:23 --> 00:49:28 And as I stated before, you're one of the smartest people out here on the planet.
00:49:29 --> 00:49:34 And I greatly appreciate you sharing some of that knowledge with me and the audience.
00:49:36 --> 00:49:40 Well, thanks for your kind words. Yes, sir. All right. So to kind of break the
00:49:40 --> 00:49:43 ice a little bit, I do a couple of things.
00:49:43 --> 00:49:46 The first thing is I offer a quote to the guest.
00:49:47 --> 00:49:55 So this is your quote: The underlying purpose of AI is to allow wealth to access
00:49:55 --> 00:50:01 skill while removing from the skilled the ability to access wealth.
00:50:02 --> 00:50:04 That's interesting. I don't remember saying that, but.
00:50:05 --> 00:50:11 No, no, you didn't say it. You know, you get to this point where you've said
00:50:11 --> 00:50:16 so many things, you are sometimes surprised by what you have put out there.
00:50:17 --> 00:50:20 And I'm afraid that is true.
00:50:20 --> 00:50:28 It's true in the sense that AI is really serving the interests of an oligopoly
00:50:28 --> 00:50:34 and political leaders who are pretty intent on manipulating people's behavior
00:50:34 --> 00:50:36 and attitudes and beliefs.
00:50:36 --> 00:50:41 And it's not to say there aren't also a thousand good things that AI is doing.
00:50:41 --> 00:50:46 But there's a difference between there being a thousand good things and what
00:50:46 --> 00:50:52 the overall structure of the technology is and who it serves.
00:50:53 --> 00:50:58 So when an Elon Musk, for example, enters into the Trump government,
00:50:58 --> 00:51:03 he may want a lot of things, including fame and power.
00:51:03 --> 00:51:08 But he's particularly interested in all that data the U.S.
00:51:08 --> 00:51:15 government has on each of us, conglomerating that, and the power it will give
00:51:15 --> 00:51:22 to AI and those who utilize or are behind the AI in their practices,
00:51:22 --> 00:51:27 an awful lot of which is directed at manipulating our behavior,
00:51:27 --> 00:51:31 again, for marketing and propaganda purposes. Yeah.
00:51:32 --> 00:51:38 All right. So now the other icebreaker is I need you to give me a number between one and twenty.
00:51:40 --> 00:51:47 17. Okay. What's something about people who see the world differently than you
00:51:47 --> 00:51:49 that you've come to appreciate?
00:51:49 --> 00:51:54 Boy, that's a great question, but it's almost the other way around.
00:51:54 --> 00:52:01 I probably came out of an elitist background and that kind of background where
00:52:01 --> 00:52:04 you're special and your attitudes are the right ones.
00:52:05 --> 00:52:13 So, I mean, it's taken a lifetime of breaking some of those pretenses and biases
00:52:13 --> 00:52:18 down and just realizing that almost everybody around me has something to say to
00:52:18 --> 00:52:20 me that I need to assimilate.
00:52:21 --> 00:52:27 So, I mean, I think that's the broad answer to it. But the more specific one
00:52:27 --> 00:52:29 is I've traveled widely.
00:52:29 --> 00:52:36 So I've been exposed to people all over the world who just don't see the world the way I do.
00:52:36 --> 00:52:44 I'm surprised, for example, when I'm in China, how vigorously some of the most
00:52:44 --> 00:52:46 thoughtful Chinese defend their government.
00:52:47 --> 00:52:58 And I don't think it's a voice coming out of their feeling pressured or exploited or manipulated,
00:52:58 --> 00:53:02 but they really do, in some heartfelt way,
00:53:03 --> 00:53:08 feel that their government is working relatively well for the Chinese people.
00:53:08 --> 00:53:16 That's particularly true since, for example, the present government led the effort
00:53:16 --> 00:53:22 that brought 500 million people, some say 800 million people, out of poverty.
00:53:23 --> 00:53:28 So, just an example of something that surprised me when I encountered it.
00:53:29 --> 00:53:34 Yeah, and that's a good perspective to know.
00:53:35 --> 00:53:39 So, how much has science fiction influenced your work?
00:53:39 --> 00:53:42 You talked about how you came from an elitist background, but I want to go back.
00:53:43 --> 00:53:48 You know, because I think about people like Isaac Asimov and,
00:53:48 --> 00:53:55 you know, a lot of others, Octavia Butler, you know, all those people that were
00:53:55 --> 00:53:58 kind of predicting where we were going.
00:53:59 --> 00:54:06 And so I just, you know, I usually don't get to ask a lot of scientists a lot of questions.
00:54:06 --> 00:54:17 So, you know, did science fiction kind of influence you going into this... bioethicist?
00:54:18 --> 00:54:22 I can't even say if I'm saying it right. But your profession.
00:54:23 --> 00:54:26 Yeah, I mean, you kind of, right.
00:54:27 --> 00:54:31 I mean, how could science fiction not? Because all the projections people are
00:54:31 --> 00:54:37 making about the future are science fiction, and a lot of them showed up very early on.
00:54:37 --> 00:54:42 But though I read science fiction, you know, the classics in it,
00:54:42 --> 00:54:47 I'm not one of the big avid science fiction readers and followers,
00:54:47 --> 00:54:54 and I really got into this field around 2001, when it started to look like an
00:54:54 --> 00:54:59 awful lot of science fiction was going to become reality over the next 50 to 100 years.
00:55:00 --> 00:55:08 So I can't say that, as for many scientists, science fiction is what's most prominent for me.
00:55:08 --> 00:55:11 For them, it's why they became scientists.
00:55:12 --> 00:55:17 And certainly something like Asimov's three laws have played a role in,
00:55:17 --> 00:55:22 I mean, the first few pages of Moral Machines: Teaching Robots Right from Wrong,
00:55:22 --> 00:55:28 one of my books that I co-authored with Colin Allen, where we look at the prospects
00:55:28 --> 00:55:33 for implementing sensitivity to moral considerations in robots and computers
00:55:33 --> 00:55:36 and the ability to factor those into their choices and actions,
00:55:37 --> 00:55:39 we sort of mapped that field out.
00:55:40 --> 00:55:44 But sort of the first thing we heard from everyone was, well,
00:55:44 --> 00:55:50 didn't Asimov solve that problem with his three, and then later four, laws of robotics?
00:55:50 --> 00:55:56 And what people seem to forget is Asimov laid out these perfectly logical,
00:55:56 --> 00:56:02 hierarchically arranged principles, and then he wrote roughly 80 stories,
00:56:02 --> 00:56:08 more than 80 stories actually, not all of them but many, pointing out why these three laws wouldn't work.
00:56:08 --> 00:56:16 So why three simple ethical principles are not adequate to either have sensitivity
00:56:16 --> 00:56:20 to ethical considerations or to know what's the right thing to do.
00:56:21 --> 00:56:25 Yeah. So if it wasn't science fiction per se, what inspired you to do this?
00:56:25 --> 00:56:29 Because you've been doing this for at least over 15 years.
00:56:29 --> 00:56:35 You know, 15 years ago, nobody was really talking about AI, at least not in
00:56:35 --> 00:56:37 casual conversation like we do now.
00:56:38 --> 00:56:42 So what was the trigger to get you involved in this stuff?
00:56:42 --> 00:56:48 Well, I mean, for me, it probably starts much earlier, and it is on the ethical side.
00:56:49 --> 00:56:56 When I was 17 years old, two local ministers came to my house early in the morning
00:56:56 --> 00:56:59 to take me to the March on Washington in 1963.
00:57:00 --> 00:57:07 So I was in the early part of that kind of folk movement, civil rights,
00:57:07 --> 00:57:12 anti-Vietnam War generation, and I was very much an activist,
00:57:12 --> 00:57:19 you know, having spent some time in prison in Alabama during the Selma-Montgomery
00:57:19 --> 00:57:23 marches, you know, and that followed me.
00:57:23 --> 00:57:30 And I also got very interested in Eastern religions and started thinking about
00:57:30 --> 00:57:38 why ethics didn't work very well and what it needed. So when I encountered
00:57:39 --> 00:57:44 a relatively small culture, a few hundred people worldwide, of
00:57:44 --> 00:57:50 transhumanists and others who were waxing poetic about the benefits they could
00:57:50 --> 00:57:53 get from technological enhancements and other things,
00:57:54 --> 00:57:57 my antennas were already out there for, well, what could go wrong?
00:57:58 --> 00:58:02 And, of course, some of them were already pointing out what would go wrong.
00:58:02 --> 00:58:09 And Eliezer Yudkowsky was already telling people that the singularity was coming
00:58:09 --> 00:58:13 and robots were going to basically decimate humans.
00:58:13 --> 00:58:20 And AI was going to basically decimate humans. So that got me into the field.
00:58:20 --> 00:58:26 I very early on became the chair of the technology and ethics study group at Yale.
00:58:26 --> 00:58:32 But it was such a vast field that I needed something to get my hands on.
00:58:32 --> 00:58:37 And this project that I did the book together with Colin Allen,
00:58:37 --> 00:58:41 and I did some earlier research with a woman, Iva Smit,
00:58:42 --> 00:58:46 on whether you can implement moral decision-making faculties in robots,
00:58:46 --> 00:58:53 sort of helped me focus a little bit and get back to my primary interest,
00:58:53 --> 00:58:59 even going back as a kid, which is, can we know what is right and good? And if so, how?
00:58:59 --> 00:59:07 And so the book became a sneaky way of having this ethical inquiry in the context
00:59:07 --> 00:59:12 of technology, which made it much more accessible for many people,
00:59:12 --> 00:59:17 though probably it was more successful in its subtext,
00:59:17 --> 00:59:22 meaning it's a primer on ethics, what works and what doesn't work,
00:59:22 --> 00:59:27 than it was as a book that really gave us a clear pathway
00:59:27 --> 00:59:32 to ensure that future technologies would act in an ethical way.
00:59:33 --> 00:59:40 And I guess now we're at the stage where I'm bewildered at the extent to which
00:59:40 --> 00:59:47 we indulge essentially unsafe technologies with limited safeguards.
00:59:48 --> 00:59:54 Yeah. So that perfectly segues, well, I'd say perfectly, into my next question.
00:59:54 --> 01:00:00 One of the biggest challenges in artificial intelligence, as highlighted by Dr.
01:00:00 --> 01:00:06 Joy Buolamwini, has been facial recognition of darker-skinned humans and speech
01:00:06 --> 01:00:09 recognition of African-American vernaculars.
01:00:10 --> 01:00:15 How ethical can AI be if it perpetuates racial bias?
01:00:16 --> 01:00:19 It can't. That's pretty simple.
01:00:19 --> 01:00:26 It can't, and it can't because it's being trained on not only—I don't care about skin color.
01:00:27 --> 01:00:29 I don't care if it shows that you're green or purple.
01:00:29 --> 01:00:35 That's not the problem. The problem is it's being trained on this vast body
01:00:35 --> 01:00:42 of historical information and more recent information, which is intrinsically racist.
01:00:42 --> 01:00:45 So now you have this problem that you
01:00:45 --> 01:00:52 have a skin color and you can match it with both historical and more recent
01:00:52 --> 01:00:59 data where people said skin color equates to or has a very strong connection
01:00:59 --> 01:01:02 to because that's how AI works,
01:01:03 --> 01:01:04 basically. It's a connectionist network.
01:01:05 --> 01:01:07 It's creating strength.
01:01:08 --> 01:01:13 It's quantifying the relationship between one characteristic and another characteristic.
01:01:13 --> 01:01:21 So there's no way that the identification of skin color is going to be a positive
01:01:21 --> 01:01:28 thing if you have backlogs of racist information to draw on to interpret what
01:01:28 --> 01:01:31 the skin color means. Yeah.
01:01:33 --> 01:01:38 So you have... I'm credited for being one of the early people to...
01:01:38 --> 01:01:44 I mean, people like myself were talking about bias long before people like Joy
01:01:44 --> 01:01:48 came along and brought some empirical evidence to bear.
01:01:49 --> 01:01:50 Yeah, yeah. And...
01:01:52 --> 01:01:56 Yeah, so she really has kind of taken it to a new level.
01:01:57 --> 01:02:01 And, you know, a lot of times when you look back at history,
01:02:01 --> 01:02:06 when they used to talk about the bell curve and all that stuff,
01:02:07 --> 01:02:12 we didn't have people in positions really to counter that.
01:02:13 --> 01:02:18 And I think now in this day and age, we do have people to counter the misnomers
01:02:18 --> 01:02:24 and so on. And so I hope that more people like Dr. Joy get engaged.
01:02:25 --> 01:02:29 You're definitely one of those people, but, you know, more people get engaged
01:02:29 --> 01:02:34 in making sure that the information that's being programmed into artificial
01:02:34 --> 01:02:37 intelligence is more balanced and accurate.
01:02:38 --> 01:02:42 I'm one of the old, I mean, I'm one of the older people, so I'm not going to
01:02:42 --> 01:02:46 be able to be in this fight for all that much longer. You know,
01:02:46 --> 01:02:50 we'll see what nature or God grants me.
01:02:50 --> 01:03:01 But what's thrilled me about the past 25 years is it's not just race,
01:03:01 --> 01:03:03 but gender in particular.
01:03:04 --> 01:03:08 I mean, it used to be that most panels, if they had a woman,
01:03:09 --> 01:03:13 she was a token woman, you know, on the panel.
01:03:13 --> 01:03:20 And one of my closest colleagues, Anja Kaspersen, was constantly that
01:03:20 --> 01:03:25 token female on panels. I mean, it was sort of obnoxious.
01:03:25 --> 01:03:29 And Anja did a lot herself to nurture other women coming along.
01:03:29 --> 01:03:35 But now, it's not even just a question of a token woman on the panel.
01:03:35 --> 01:03:36 They are brilliant.
01:03:37 --> 01:03:41 They've had the experience now. They've had the opportunity.
01:03:41 --> 01:03:47 And though sometimes I think we're leaving the world in a worse shape than it
01:03:47 --> 01:03:53 was when I came into it, and that was the big aspiration of us 60s types,
01:03:54 --> 01:04:02 now I look around and the one thing I see is all the brilliant women and African
01:04:02 --> 01:04:08 and Asian and other peoples who know their stuff.
01:04:09 --> 01:04:13 They aren't tokens. They are the leaders, and the next generation will have
01:04:13 --> 01:04:16 that leadership. That speaks well.
01:04:17 --> 01:04:22 Yeah. So you've spoken before to the UN concerning military use of AI.
01:04:23 --> 01:04:26 What are your biggest ethical concerns in that regard?
01:04:27 --> 01:04:32 Machines should not be making life and death decisions about humans. Point blank.
01:04:33 --> 01:04:36 Only humans get to make life and death decisions about you.
01:04:37 --> 01:04:43 And that begs the question of when the machine is an extension of human will and intention.
01:04:43 --> 01:04:49 But I think even if the machine is an extension of human will and intention, that's not enough.
01:04:50 --> 01:04:58 There needs to be a responsible reflection on the output of the machine before
01:04:58 --> 01:05:03 humans act on taking a life based on machine output.
01:05:03 --> 01:05:09 And I don't think that's just a simple question of accuracy or who's better
01:05:09 --> 01:05:10 at the decision-making.
01:05:10 --> 01:05:19 I think that's a more moral question about human integrity, conscience.
01:05:20 --> 01:05:30 Willingness to give due respect to other humans, regardless of whether they agree with them.
01:05:30 --> 01:05:36 And unfortunately, now we're getting into these technologies like Lavender and
01:05:36 --> 01:05:44 Where's Daddy and so forth that can track tens of thousands of people at once.
01:05:44 --> 01:05:48 And if, for example, one of those people goes into a building,
01:05:49 --> 01:05:55 it can match that with these other technologies identifying what's in various
01:05:55 --> 01:06:02 buildings, and give a command to humans to strike that building.
01:06:02 --> 01:06:06 And it would be one thing if the humans then took, let's say,
01:06:07 --> 01:06:13 even 10 minutes to get other information and see if they know how many civilians,
01:06:13 --> 01:06:18 noncombatants, or others were in that building.
01:06:18 --> 01:06:24 But from what we're learning, that's not happening. What we're learning is the
01:06:24 --> 01:06:26 strike orders become pretty perfunctory.
01:06:27 --> 01:06:31 And to me, that's one of the great tragedies for humanity.
01:06:31 --> 01:06:36 I mean, that means we're moving into the use of weaponry that,
01:06:36 --> 01:06:41 I've said this before, but basically that Hitler would have loved,
01:06:41 --> 01:06:45 and any modern-day genocidal leader will love.
01:06:46 --> 01:06:53 Because one of the criticisms that's out there is that in Israel's response
01:06:53 --> 01:07:03 to October 7th, they're using a lot of AI to make strikes in Gaza and all that.
01:07:03 --> 01:07:11 And, you know, a lot of people, it's not been confirmed, but many people believe
01:07:11 --> 01:07:15 that that's what they've been using to… Well,
01:07:15 --> 01:07:20 we know they do have these weapons like Lavender and Where's Daddy and so forth.
01:07:20 --> 01:07:25 It's pretty well confirmed. I wouldn't say it's absolute.
01:07:25 --> 01:07:30 You know, there have been only a few articles written by people who have given
01:07:30 --> 01:07:35 testimony. And there's a lot of the details we don't have.
01:07:35 --> 01:07:41 But it looks pretty clear that that's taking place. And it doesn't matter.
01:07:41 --> 01:07:46 Civilians are still being targeted. This is a violation of international humanitarian law.
01:07:47 --> 01:07:54 I fully support the Israelis' right to strike back,
01:07:55 --> 01:08:03 given the horrific nature of what happened in October, but there has to be some restraint.
01:08:03 --> 01:08:09 And now we are seeing even the Israeli people demonstrate against their government
01:08:09 --> 01:08:13 and say there is no need for the continuation of this war.
01:08:13 --> 01:08:20 This is more about what serves the Netanyahu government than what serves Israel.
01:08:21 --> 01:08:27 Yeah. Yeah. No argument from me on that, whether we were talking about AI or anything else.
01:08:28 --> 01:08:35 So since this is a political show, talk about the need for ethics in political campaigning.
01:08:35 --> 01:08:45 Because one of the concerns that, you know, has come up early is how commercials are made, right?
01:08:45 --> 01:08:51 It's like they'll take my voice and turn it around and make it seem like I said,
01:08:52 --> 01:08:55 you know, we should all eat dogs and eat cats.
01:08:55 --> 01:09:00 And, you know, I may have been talking about, you know, the need for more animal
01:09:00 --> 01:09:03 shelters, but somehow, some way they splice it.
01:09:03 --> 01:09:07 And, of course, people have been up in arms about that.
01:09:07 --> 01:09:15 But just talk to me in detail about what your concerns are, and do you have any suggestions
01:09:15 --> 01:09:17 of how we can regulate that?
01:09:18 --> 01:09:26 Yeah, that's not an easy question to answer in terms of how we can regulate it.
01:09:26 --> 01:09:33 But what's going on? I mean, you could always splice and edit and put together
01:09:33 --> 01:09:39 a little piece of film that made somebody say something that they didn't say.
01:09:39 --> 01:09:43 But we now can do that in an instant. We can create deep fakes.
01:09:43 --> 01:09:50 We can have people give long speeches where their lips are synchronized with
01:09:50 --> 01:09:54 their words, and they didn't say it.
01:09:54 --> 01:10:01 So the technologies for disinformation and misinformation have grown exponentially,
01:10:02 --> 01:10:07 and it's very hard to track what you're getting and what you aren't getting.
01:10:08 --> 01:10:14 And the old way, at least the way I used to manage this was that if I trusted
01:10:14 --> 01:10:19 somebody, I tended to go along with their opinions on things that I didn't have
01:10:19 --> 01:10:20 time to research myself.
01:10:21 --> 01:10:24 But now you have campaigns of lying
01:10:24 --> 01:10:34 or distortions or selective picking of information so that it's becoming increasingly
01:10:34 --> 01:10:38 difficult for people to know what is and what isn't true without doing a lot
01:10:38 --> 01:10:43 of work themselves or without being sure they have really
01:10:44 --> 01:10:50 trustworthy people to turn to. And the problem is, even with those we trust the most,
01:10:50 --> 01:10:56 that doesn't mean we go along with all their beliefs or understand what they will and will not do.
01:10:56 --> 01:11:05 So we're moving into this world where trust is at a premium and AI,
01:11:05 --> 01:11:12 not only in disinformation, but even in real information can still be used just
01:11:12 --> 01:11:14 merely to manipulate behavior.
01:11:15 --> 01:11:22 So consider that you're in the mall tomorrow, and suddenly music comes blasting
01:11:22 --> 01:11:29 out of one of the stores, which just happens to be one of your favorite pieces of music,
01:11:29 --> 01:11:31 get you dancing or whatever, you know?
01:11:31 --> 01:11:37 And lo and behold, that store happens to be selling something that you searched
01:11:37 --> 01:11:40 for on the internet the day before.
01:11:41 --> 01:11:46 That's the kind of world we're moving into. And that's a pretty hairy world.
01:11:46 --> 01:11:51 It's no longer that we're training the machines to be as intelligent as us.
01:11:51 --> 01:11:59 The machines are training us to be as dumb as they want, because in general,
01:11:59 --> 01:12:03 the machines don't want anything, but those behind those machines,
01:12:03 --> 01:12:10 those marketing products, those trying to persuade us politically, want.
01:12:10 --> 01:12:14 And what are they getting out of it? They're getting all kinds of power.
01:12:15 --> 01:12:21 Elon Musk spent chump change, $278 million, to buy the executive,
01:12:22 --> 01:12:28 when the tech oligopoly had for years been trying to buy at least the legislature
01:12:28 --> 01:12:34 to ensure that they didn't enact the laws that got in the way of their developments.
01:12:34 --> 01:12:39 I mean, now we've moved into a universe where...
01:12:40 --> 01:12:44 Presumably, AI will be generating trillions and trillions of dollars.
01:12:44 --> 01:12:52 And where is that money going? It's all going to those who hold stock in the
01:12:52 --> 01:12:55 leading companies, or at least 90% of it is.
01:12:55 --> 01:13:03 I happened to be at a lunch at the UN, at the ITU, which is International Telecommunication
01:13:03 --> 01:13:06 Union, and that's part of the UN.
01:13:06 --> 01:13:14 And they had basically the representatives to the UN in Geneva from all of the
01:13:14 --> 01:13:15 countries at this lunch.
01:13:16 --> 01:13:23 And the keynote speaker was a vice president of Amazon Web Services and an official from Microsoft.
01:13:24 --> 01:13:29 And the guy from Amazon Web Services, after he had waxed poetic about all the
01:13:29 --> 01:13:34 good things AI was going to do for us, he talked about the trillions of dollars it would generate.
01:13:34 --> 01:13:42 And the Microsoft guy got up and he talked about how AI was helping companies
01:13:42 --> 01:13:44 in their energy efficiency.
01:13:45 --> 01:13:48 And I got up afterwards and I just slammed them both.
01:13:48 --> 01:13:52 I mean, I said, don't tell us about the trillions of dollars AI is going to
01:13:52 --> 01:13:58 make when almost none of it is going to go to the two-thirds of the world represented
01:13:58 --> 01:14:02 by most of these ambassadors here in the room.
01:14:02 --> 01:14:08 And don't talk to us about AI contributing to energy efficiency when we all
01:14:08 --> 01:14:11 know AI has become one of the biggest consumers of energy.
01:14:12 --> 01:14:17 So this is the world we're in. We're in a world where narratives are being written
01:14:17 --> 01:14:22 for us that don't necessarily meet our interests.
01:14:22 --> 01:14:28 And the problem with both of those examples is just looking at a little piece of the problem.
01:14:28 --> 01:14:33 We can always talk about anecdotes and information that supports my viewpoint.
01:14:34 --> 01:14:43 But if we don't develop the moral intelligence and the self-understanding to
01:14:43 --> 01:14:48 counteract this piecemeal analysis of the problems at hand,
01:14:49 --> 01:14:55 we certainly aren't going to survive in any meaningful way the generation of
01:14:55 --> 01:14:57 AI and other emerging technologies.
01:14:59 --> 01:15:04 So let me ask you a question I asked another guest. Who is a better futurist,
01:15:04 --> 01:15:07 you or Elon Musk, and why?
01:15:09 --> 01:15:12 I don't know how to answer that question.
01:15:13 --> 01:15:17 You know, I mean, the problem is none of us is everything.
01:15:17 --> 01:15:20 None of us really knows what the future is going to be.
01:15:20 --> 01:15:26 There's some heavy predictions out there. We all select information that feeds
01:15:26 --> 01:15:28 our own wishes and desires.
01:15:28 --> 01:15:34 I just happen to be the guy who's now made a career of being the gadfly,
01:15:35 --> 01:15:38 of underscoring for years what's not getting attention.
01:15:39 --> 01:15:47 Elon Musk is, you know, building the companies that provide the technologies
01:15:47 --> 01:15:49 of the future that he wants, you know.
01:15:49 --> 01:15:58 So, I think most people would pick Elon Musk, or at least would have until five months ago.
01:16:00 --> 01:16:06 Now that they've gotten to know him, though, they might think better of that.
01:16:06 --> 01:16:14 I think if it comes to the good of humanity, people like me are head and shoulders above Elon Musk.
01:16:15 --> 01:16:21 But if it comes to actually the realization of futuristic gadgets, yes, of course, Elon.
01:16:22 --> 01:16:28 Yeah, yeah, I got you. And, you know, that brings into the perspective,
01:16:29 --> 01:16:31 right? Because you're going to have people.
01:16:31 --> 01:16:34 Can I add one thing to that? Go ahead.
01:16:35 --> 01:16:40 I may have given myself a little praise, but if it's a question of who's winning,
01:16:41 --> 01:16:43 clearly it's the tech oligopoly.
01:16:44 --> 01:16:49 I mean, those of us who are trying to underscore what requires attention,
01:16:50 --> 01:16:55 the things that require attention aren't getting anywhere near the attention they need.
01:16:56 --> 01:16:59 Yeah. And, you know, and that's what I was going to say about the perspective.
01:17:00 --> 01:17:04 You know, I'm a I'm a big science fiction guy. And, you know,
01:17:04 --> 01:17:07 one of my favorites is is the Star Wars franchise.
01:17:08 --> 01:17:13 And it's like, you know, the Empire has all the cool stuff.
01:17:13 --> 01:17:19 They got the Death Star and, you know, all the things.
01:17:19 --> 01:17:24 And, you know, the rebels don't have as much technology, but they have
01:17:24 --> 01:17:27 the moral compass to challenge it.
01:17:28 --> 01:17:32 And, you know, when we watch the movies, we want the moral compass to win,
01:17:33 --> 01:17:37 but in real life, eh, not so much.
01:17:37 --> 01:17:42 So, you know, I appreciate how introspective you were with that answer because,
01:17:42 --> 01:17:46 you know, a lot of us, like I said, we like the cool stuff.
01:17:46 --> 01:17:50 The fact that we're doing this interview via technology, right?
01:17:50 --> 01:17:58 That's cool. But, you know, I appreciate you because you remind us,
01:17:58 --> 01:18:01 it's a biblical principle,
01:18:01 --> 01:18:05 you know, what is it to gain the whole world if you lose your soul,
01:18:05 --> 01:18:07 right? If you lose that morality.
01:18:08 --> 01:18:17 So I appreciate gadflies like you who remind us, remind us that there has to be a balance.
01:18:18 --> 01:18:21 I think there's also a thing in the kind of self-understanding we're going to
01:18:21 --> 01:18:25 need to be able to battle off everything that's trying to manipulate us,
01:18:25 --> 01:18:31 not battle off, but just defuse its impact upon us, is probably a revival of
01:18:31 --> 01:18:35 what used to be called virtue ethics or, you know,
01:18:35 --> 01:18:37 or not losing your soul.
01:18:38 --> 01:18:43 The recognition that that's pointing to something, whether we have souls or
01:18:43 --> 01:18:45 not, I'm not concerned with that.
01:18:46 --> 01:18:52 You know, people will debate that. But there's something intrinsic to us that
01:18:52 --> 01:18:55 functions like a moral barometer,
01:18:55 --> 01:19:03 even if there is no such thing as a moral compass or anything like that.
01:19:03 --> 01:19:07 And I call that the need for a silent ethic.
01:19:08 --> 01:19:14 I think we need that kind of internal re-engagement with life.
01:19:16 --> 01:19:22 And we need to complement that with a kind of ethical analysis.
01:19:22 --> 01:19:26 It doesn't only look at what the greatest good for the greatest number is,
01:19:26 --> 01:19:33 but also looks at the price we pay for each option we choose.
01:19:33 --> 01:19:35 So what goes wrong or who loses?
01:19:36 --> 01:19:44 My own feeling is if we don't also speak to the detriments of the choice we
01:19:44 --> 01:19:48 finally make, and ameliorate those, lessen those,
01:19:48 --> 01:19:52 there's no way we can call our action good for the world.
01:19:53 --> 01:20:02 Yeah. Well, Wendell Wallach, the time that we have is definitely not adequate
01:20:02 --> 01:20:06 to cover everything as intensely as we should.
01:20:06 --> 01:20:09 But I do appreciate the time that you gave me.
01:20:09 --> 01:20:14 And in this brief time that I've had a chance to talk with you,
01:20:14 --> 01:20:17 I understand why people hold you in such high regard.
01:20:18 --> 01:20:22 And again, I am honored that you came on the podcast, and I am honored that you
01:20:22 --> 01:20:26 are doing the work that you have committed yourself to do.
01:20:27 --> 01:20:30 You stated you don't know how much longer you have, but
01:20:30 --> 01:20:36 I will say with utmost confidence that, with the time you do have left,
01:20:36 --> 01:20:41 you're going to maximize it and continue to educate us and continue to keep
01:20:41 --> 01:20:45 us balanced and on the right path. So thank you for coming on the podcast, and
01:20:45 --> 01:20:48 thank you for doing what you're doing. Thank you ever so much.
01:20:48 --> 01:20:56 You know, any opportunity to speak out and have one more person start to think about these things.
01:20:58 --> 01:21:06 I really appreciate that you have a podcast where you can bring this vast body
01:21:06 --> 01:21:10 of illumination to the difficult political challenges we're covering.
01:21:10 --> 01:21:14 So before you go, if people want to reach out to you,
01:21:14 --> 01:21:18 if people want to get in touch with you and stuff, how can they do that?
01:21:18 --> 01:21:24 Well, they can do that through my email, wwallach.com,
01:21:24 --> 01:21:35 wwallach, excuse me, at comcast.net, or wendell.wallach at yale.edu, or through LinkedIn.
01:21:35 --> 01:21:37 Those are kind of ways to reach me.
01:21:38 --> 01:21:43 And unfortunately, I'm meeting too many people and can't always respond.
01:21:43 --> 01:21:49 So if there's a specific reason or, you know, they say they've listened to this
01:21:49 --> 01:21:54 podcast, then, you know, that's just helpful for me to identify them.
01:21:55 --> 01:21:58 Yes, sir. All right. Yes, sir.
01:21:58 --> 01:22:02 Well, again, thank you for coming on and appreciate you taking the time.
01:22:03 --> 01:22:05 All right, guys. We're going to catch you all on the other side.
01:22:06 --> 01:22:17 Music.
01:22:17 --> 01:22:24 All right, and we are back. So I want to thank Denise Love Hewett and Wendell
01:22:24 --> 01:22:27 Wallach for taking the time out to come on the podcast.
01:22:28 --> 01:22:36 It is really, really an honor to have people like that come on,
01:22:36 --> 01:22:42 not just these two guests, but all the guests that I've been privileged to have on the podcast.
01:22:42 --> 01:22:48 But in this particular episode, to have both Denise and Wendell come on and
01:22:48 --> 01:22:55 to be so gracious with their time and their expertise and their commitment to what they're doing,
01:22:56 --> 01:23:03 because they've got other things to do, but to set aside time to talk to us
01:23:03 --> 01:23:10 and kind of explain why they're doing what they're doing and how it relates to all of us.
01:23:10 --> 01:23:15 But particularly for those of us in the African-American community, you know,
01:23:15 --> 01:23:19 in the political realm. It's really, really important.
01:23:19 --> 01:23:23 So I thank them again for coming on.
01:23:24 --> 01:23:31 So I want to talk just real quickly because I'm trying to really control the emotions, per se.
01:23:31 --> 01:23:41 But, you know, dealing with this situation with Kilmar Abrego Garcia.
01:23:42 --> 01:23:47 So some of you may know that at one time in my life, I was a paralegal for the
01:23:47 --> 01:23:49 Mississippi Immigrants Rights Alliance.
01:23:50 --> 01:23:54 Shout out to Bill and Patricia for the work that they've been doing all these
01:23:54 --> 01:23:59 years to not only help people get
01:23:59 --> 01:24:03 all their paperwork done. And it's a lot of paperwork that people have to do,
01:24:03 --> 01:24:06 especially those folks who are seeking asylum.
01:24:06 --> 01:24:10 It's a lot of paperwork. There is money involved, right?
01:24:11 --> 01:24:20 And so it's just a lot of work that needs to be done to help people to go through
01:24:20 --> 01:24:23 the process legally in the United States,
01:24:23 --> 01:24:27 to become either a permanent resident or a citizen, or even sometimes just to
01:24:27 --> 01:24:29 get a student or work visa, right?
01:24:30 --> 01:24:36 There's a lot of work to be done, and there's a lot of flaws in the system that should be addressed.
01:24:36 --> 01:24:41 But, you know, right now we've got an administration that basically just wants
01:24:41 --> 01:24:43 to kick everybody out that's brown and black.
01:24:46 --> 01:24:51 Or any other shade of the rainbow, to be honest. And now this president is talking
01:24:51 --> 01:24:56 about deporting Americans, but we'll get into that in a minute. Yeah.
01:24:57 --> 01:25:03 You know, so anytime there's immigration issues, because of that time I spent,
01:25:03 --> 01:25:07 I was really, really sensitized to that.
01:25:07 --> 01:25:10 It was one thing to be a politician and to speak to groups and talk about issues
01:25:10 --> 01:25:16 and deal with bills on the floor, but to actually do the work and really get
01:25:16 --> 01:25:21 to know people who are trying to do the right thing and watch all the hoops
01:25:21 --> 01:25:22 that they have to jump through.
01:25:22 --> 01:25:28 And to celebrate people when they get their permanent residency or their citizenship.
01:25:28 --> 01:25:34 You know, it's ups and downs. It's triumphs and tragedies in that process.
01:25:35 --> 01:25:41 So, again, like I said, there's some work that needs to be done to make it simpler,
01:25:42 --> 01:25:44 to make the process better.
01:25:45 --> 01:25:51 But we're not going to do it just being hateful, spiteful, and reckless.
01:25:53 --> 01:25:57 And the case with Mr. Abrego Garcia is that kind of case.
01:25:57 --> 01:26:08 So if you understand all this, you know, he was detained.
01:26:09 --> 01:26:18 And his situation was that he is not a legal immigrant. He is not a documented immigrant.
01:26:19 --> 01:26:28 His situation was he fled his country, El Salvador, and he was,
01:26:28 --> 01:26:29 you know, being threatened,
01:26:30 --> 01:26:34 you know, either being pushed to be initiated into a gang or, you know,
01:26:34 --> 01:26:36 facing threats against his family or both.
01:26:36 --> 01:26:39 Got out of El Salvador, got to the United States.
01:26:40 --> 01:26:47 And first time he was detained, once he explained his situation,
01:26:47 --> 01:26:51 he was given an exemption to stay,
01:26:53 --> 01:26:59 an order not to be sent back to his home country.
01:26:59 --> 01:27:07 So in the process of that, he got married, started a family, has been working.
01:27:08 --> 01:27:14 Don't know any details about any further paperwork he may have done.
01:27:14 --> 01:27:19 He may have been in the process of trying to get asylum. Don't know.
01:27:19 --> 01:27:22 Haven't really talked about that in the news.
01:27:23 --> 01:27:30 But he was given a status, a protected status to stay, which was kind of a unique
01:27:30 --> 01:27:35 status that can be given to people that are in dire situations, right,
01:27:35 --> 01:27:38 that have not started the asylum process.
01:27:39 --> 01:27:45 But nonetheless, he's living his life, and then, of course, Donald Trump gets reelected.
01:27:45 --> 01:27:51 And so now, you know, he got rounded up with some other folks again,
01:27:51 --> 01:27:52 and he's not going to be able to…
01:27:53 --> 01:28:00 And the way the story goes from what I understand is that when they initially
01:28:00 --> 01:28:05 had a group of people to be sent to El Salvador, he was not on that list.
01:28:06 --> 01:28:10 But somehow, some way, he got bumped up.
01:28:11 --> 01:28:17 Either people they thought they had were not in their custody or whatever the case
01:28:17 --> 01:28:22 may be, but they ended up finding a spot on the plane for him.
01:28:23 --> 01:28:27 And that's how he ended up in El Salvador. If everything had gone according
01:28:27 --> 01:28:34 to, I guess, whatever their plan was, he would still have been in detention in the United States.
01:28:35 --> 01:28:41 But because in some quirky way space was available, they just threw him on the plane too.
01:28:42 --> 01:28:44 And that's how he ended up there.
01:28:46 --> 01:28:54 And then once they admitted it, so I want you to understand how rare of an occurrence that is.
01:28:55 --> 01:28:59 This administration admitted to making a mistake. Now, they've made a lot of
01:28:59 --> 01:29:02 them, but they will never acknowledge them for the most part.
01:29:02 --> 01:29:06 But this one time, they admitted that they made a mistake.
01:29:06 --> 01:29:11 But being the prideful people they are, they won't correct a mistake.
01:29:13 --> 01:29:16 You know, so, you know, and there could have been cool ways to do it.
01:29:17 --> 01:29:22 President of El Salvador, Bukele, he came to the United States,
01:29:22 --> 01:29:26 said, hey, you know, stick him on a plane and, you know, we'll get him back.
01:29:26 --> 01:29:32 We'll take him, you know, you land in Dulles or Reagan.
01:29:32 --> 01:29:36 You know, we grab him, you know, we take you in the limo and head to the White
01:29:36 --> 01:29:39 House and we'll grab him and put him back in detention or whatever.
01:29:40 --> 01:29:44 You know, at least he's back in the United States. And he's out of the country
01:29:44 --> 01:29:52 that he fled from due to gang violence, right? They could very easily have done that.
01:29:52 --> 01:29:59 But no, they didn't do that. And Bukele is just a younger Latino version of
01:29:59 --> 01:30:05 Trump, you know, arrogant, don't admit to mistakes. You know, you know the type.
01:30:07 --> 01:30:13 And I noticed the reporter that asked Zelensky how come he wasn't wearing a suit
01:30:13 --> 01:30:20 didn't ask Bukele the same question. But that's just how trivial they are.
01:30:20 --> 01:30:24 So it is what it is. But anyway, so Bukele said, well, you know,
01:30:24 --> 01:30:29 who are we to send a terrorist back to the United States?
01:30:29 --> 01:30:33 Well, first of all, you haven't had any trial to prove that he was a terrorist.
01:30:33 --> 01:30:39 So good luck on that as far as any defamation lawsuits go.
01:30:39 --> 01:30:45 But, you know, he could very easily have said, hey, look, man, I know you're catching some flak.
01:30:45 --> 01:30:47 Well, he can fly back with me and y'all deal with him.
01:30:48 --> 01:30:54 You know, whatever. And it doesn't impact the money that you've given us to
01:30:54 --> 01:30:57 house these people, which is like $6 million.
01:30:57 --> 01:31:01 $6 million American taxpayer dollars, by the way. It's not coming out of Elon Musk's pocket.
01:31:02 --> 01:31:05 It's not coming out of Donald Trump's pocket. It's coming out of the taxpayers.
01:31:05 --> 01:31:14 $6 million taxpayers' money we're paying another country to house people that we round up and deport.
01:31:15 --> 01:31:19 So anyway, they could have solved the situation that way.
01:31:19 --> 01:31:26 So we had a U.S. senator go down, Senator Van Hollen, who was senator from Maryland
01:31:26 --> 01:31:29 where Mr. Abrego Garcia was living.
01:31:30 --> 01:31:41 And through his tenacious character, he was able to at least see Mr.
01:31:41 --> 01:31:47 Abrego Garcia, know that he's alive, and that he's as well as he can be.
01:31:47 --> 01:31:52 And I guess when the senator comes back as we're taping this,
01:31:53 --> 01:31:57 he's going to let us know what the conversation was about.
01:31:58 --> 01:32:03 What he can. I'm sure he's going to talk to the lawyers to kind of double check
01:32:03 --> 01:32:07 and make sure certain things are said or, excuse me, not said.
01:32:09 --> 01:32:15 So anyway, from the pictures, we see the most important
01:32:15 --> 01:32:16 thing, because, of course, there
01:32:16 --> 01:32:21 were rumors that the young man was unalived, as the young people say.
01:32:21 --> 01:32:29 But he's still with us, and he looks relatively healthy compared to,
01:32:29 --> 01:32:36 you know, some of the conditions that we've heard about that particular prison in El Salvador.
01:32:36 --> 01:32:43 So, again, if you could make the accommodations to do that, you could have worked
01:32:43 --> 01:32:45 out some kind of way, like, hey, look,
01:32:46 --> 01:32:52 you know, let him fly back with the senator, you know, and at that point,
01:32:52 --> 01:32:58 again, you can get him and let the senator go back to, you know, his home.
01:32:58 --> 01:33:00 And, you know, you meet Mr.
01:33:00 --> 01:33:08 Abrego Garcia, and you put him in detention here in the United States and let the due process happen.
01:33:08 --> 01:33:11 But they didn't do that. So
01:33:11 --> 01:33:22 now this man is in a country that he was trying to get away from in probably
01:33:22 --> 01:33:27 one of the worst prisons where more than likely some of the gang members that
01:33:27 --> 01:33:29 he was trying to get away from are housed.
01:33:32 --> 01:33:38 Now, I assume that they're separating the detainees from the population,
01:33:38 --> 01:33:39 from these gang members.
01:33:39 --> 01:33:44 I don't know, you know, because if they've got people from the Venezuelan gang,
01:33:45 --> 01:33:50 it would be smart to keep them away from an El Salvadoran gang just on GP.
01:33:52 --> 01:33:56 But it is what it is. So, you know.
01:33:58 --> 01:34:06 We have the U.S. Supreme Court, and now an appellate court has weighed in.
01:34:07 --> 01:34:10 A district court already threw down the gauntlet, like,
01:34:10 --> 01:34:11 look, you made a mistake.
01:34:13 --> 01:34:18 He is entitled to due process. Bring him back and let him go through the process.
01:34:19 --> 01:34:26 This administration is like, make us, right? They're daring the court to do something.
01:34:27 --> 01:34:33 And the question becomes, can the court do anything other than put them in contempt?
01:34:34 --> 01:34:40 But, you know, it's like, is anybody going to jail for contempt?
01:34:40 --> 01:34:44 Or is anybody going to pay a fine? Or it's just going to be,
01:34:44 --> 01:34:47 well, we held them in contempt and that's all we can do.
01:34:48 --> 01:34:54 And if there's nothing that can be done to force this administration to follow a ruling of a court,
01:34:56 --> 01:34:59 well, now, ladies and gentlemen, that is a constitutional crisis.
01:35:00 --> 01:35:08 Now, I know people don't want to use that phrase because it implies chaos and anarchy.
01:35:08 --> 01:35:13 And, you know, most people have envisioned a constitutional crisis as
01:35:13 --> 01:35:16 some end-of-the-world doomsday scenario.
01:35:17 --> 01:35:22 And Donald Trump is not the first president to challenge the court to make him do something.
01:35:24 --> 01:35:29 However, as volatile as these times are, as dangerous as these times are,
01:35:29 --> 01:35:37 as unpredictable as these times are, I don't think that's the right thing to do.
01:35:38 --> 01:35:44 You know, if it was like, you know, 20 people
01:35:46 --> 01:35:53 that you've made a mistake on, logistically trying to get all that together,
01:35:53 --> 01:35:59 yeah, that would take some time. But we're talking about one person, and you've had
01:35:59 --> 01:36:06 two opportunities to slide him back in, you know, bring that person back,
01:36:06 --> 01:36:08 and you didn't take advantage of that.
01:36:09 --> 01:36:15 Other presidents would have done that just to show that they have some sense of compassion.
01:36:16 --> 01:36:22 But somehow, someway, the president has gauged his supporters as having no compassion
01:36:22 --> 01:36:28 and no empathy for this particular case.
01:36:28 --> 01:36:31 So he's just going to be defiant.
01:36:32 --> 01:36:37 And, you know, you got lawyers who obviously don't care if they practice law
01:36:37 --> 01:36:43 again once they're out of the Trump administration, openly defying the court.
01:36:46 --> 01:36:49 You know, and, you know,
01:36:49 --> 01:36:54 you got a secretary of state who is a child of immigrants,
01:36:54 --> 01:37:03 who is acting like Abrego Garcia was the worst thing to ever happen to
01:37:03 --> 01:37:07 this country, and is not doing anything about it.
01:37:07 --> 01:37:14 So, you know, it's just, people talk about humility and, you know,
01:37:14 --> 01:37:20 try to discount it, as, well, you can't be so humble if you're
01:37:20 --> 01:37:21 going to be in politics and all that.
01:37:21 --> 01:37:26 Actually, yes, you can. It's actually an endearing quality if you're humble,
01:37:26 --> 01:37:30 if you show some humility, if you show some compassion.
01:37:31 --> 01:37:36 And in the big picture of things, if they can get away with that,
01:37:36 --> 01:37:41 what does that mean for the rest of us? Right.
01:37:42 --> 01:37:51 You know, does that mean we can start deporting Americans who disagree with the president?
01:37:51 --> 01:37:57 Now, he said, oh, no, we're only going to send the worst people or whatever.
01:38:00 --> 01:38:03 Based on his actions, the worst people in Donald Trump's world are people that
01:38:03 --> 01:38:08 tell him that he's wrong or people that don't agree with him. Right.
01:38:10 --> 01:38:13 So that should worry a lot of folks.
01:38:13 --> 01:38:17 There's at least 76 million people that did not want him to be president.
01:38:17 --> 01:38:19 Is he going to deport all of us?
01:38:20 --> 01:38:25 You know, is he going to try to find countries where he can stick us in so he
01:38:25 --> 01:38:31 can do what he wants to do for the remainder of his term or even be audacious
01:38:31 --> 01:38:35 enough to go for a third term unconstitutionally?
01:38:36 --> 01:38:42 I mean, Abrego Garcia is an individual, but he represents so much more than that.
01:38:42 --> 01:38:46 And I need people to understand that. This is not just an isolated incident.
01:38:47 --> 01:38:54 This is a test case because if this administration can get away with defying
01:38:54 --> 01:39:00 the courts, you know, the last judge that laid into the administration was a
01:39:00 --> 01:39:01 Reagan appointee, for crying out loud.
01:39:02 --> 01:39:09 If he can get away with defying the judicial system, then there's really no holds barred.
01:39:09 --> 01:39:13 It was bad enough for the Supreme Court to give him immunity for certain
01:39:13 --> 01:39:15 actions as president.
01:39:16 --> 01:39:23 But for him to just totally disregard the judiciary altogether because they
01:39:23 --> 01:39:27 don't agree with his position,
01:39:27 --> 01:39:31 because he didn't have the legal argument to win them over,
01:39:32 --> 01:39:36 right, that's dangerous for all of us.
01:39:37 --> 01:39:41 Now, for those of y'all who say, well, you know, Donald Trump ain't going to
01:39:41 --> 01:39:42 do nothing to me. I supported him.
01:39:43 --> 01:39:47 Ask the people who just got fired. Are they immune?
01:39:48 --> 01:39:52 You know, people that had Trump flags in their yards and were working for the federal
01:39:52 --> 01:39:56 government are now in the unemployment line, literally.
01:39:57 --> 01:40:02 So don't get this false sense of security because you supported him,
01:40:02 --> 01:40:05 especially those of you who are of color.
01:40:05 --> 01:40:09 If you think he's not going to do anything to you, you are sadly mistaken.
01:40:09 --> 01:40:14 Ask those Venezuelans who danced and sang for President Trump.
01:40:15 --> 01:40:19 Then he took away their temporary protected status, right?
01:40:19 --> 01:40:25 Ask them how they're feeling right now. The Cubans, the Haitians,
01:40:25 --> 01:40:26 ask them how they're feeling.
01:40:27 --> 01:40:32 You know, for the Ukrainians that are here, I think he's actually done
01:40:32 --> 01:40:34 something to take away their status, right?
01:40:35 --> 01:40:44 Guys, this is really happening. And if your echo chamber is Fox and OAN and
01:40:44 --> 01:40:49 Newsmax and whatever else, Alex Jones, all these people,
01:40:50 --> 01:40:55 Charlie Kirk, Ben Shapiro, some other Benny, Scott Sanders.
01:40:55 --> 01:41:02 I mean, if that's your echo chamber, I get that you're getting a different version
01:41:02 --> 01:41:07 of the story, but I really, really want to implore you.
01:41:07 --> 01:41:11 And if you're not listening to the podcast, those who are listening that know
01:41:11 --> 01:41:16 people like that, please convey to them that there's more to the story than what they're getting.
01:41:18 --> 01:41:27 And, you know, it's just like, you know, the minute that people started showing some empathy for Mr.
01:41:27 --> 01:41:33 Abrego Garcia, then all of a sudden now he's a wife beater, right? A domestic abuser.
01:41:34 --> 01:41:37 So now you're trying to justify kicking him out. Well, you know,
01:41:37 --> 01:41:43 you could have brought that up in a trial instead of kicking him out and then
01:41:43 --> 01:41:45 saying, oh, yeah, well, he did this, he did that.
01:41:45 --> 01:41:49 See, because when the public starts, you know, as a black person,
01:41:49 --> 01:41:54 when the public starts siding with an unarmed black man that's been killed by
01:41:54 --> 01:42:00 the police, first thing the news tries to do is say, well, you know, here's a mugshot.
01:42:01 --> 01:42:03 This person's been in trouble with the law before.
01:42:04 --> 01:42:10 You know, this person is not a saint. Right.
01:42:11 --> 01:42:18 So human beings are not limited to showing empathy to saints.
01:42:18 --> 01:42:23 There are people who empathize with the Menendez brothers. Right.
01:42:24 --> 01:42:29 We're not limited to showing empathy for saints.
01:42:30 --> 01:42:33 We empathize with people who have been done wrong.
01:42:34 --> 01:42:38 And I don't know what household a lot of them grew up in, but in the household
01:42:38 --> 01:42:43 I grew up in, they always tried to remind us that two wrongs don't make a right, you know.
01:42:45 --> 01:42:50 If he is what y'all say he is, then prove it in a court of law.
01:42:50 --> 01:42:54 Don't just grab him off the street and send him off to some other country and
01:42:54 --> 01:42:55 say, oh, well, you know, ain't nothing we could do.
01:42:56 --> 01:43:02 He had protection from this government, and this new administration
01:43:02 --> 01:43:08 just basically ignored that protection, just like they're doing to other people.
01:43:10 --> 01:43:16 So, you know, we, as black people, have seen this song and dance before.
01:43:16 --> 01:43:25 And whatever your position is on immigration, do understand that if we are now
01:43:25 --> 01:43:31 in a situation where the president doesn't have to adhere to court decisions,
01:43:31 --> 01:43:33 that's going to be a problem for you.
01:43:34 --> 01:43:40 It's going to be a huge problem for you, especially if you're a member of the
01:43:40 --> 01:43:42 black community who's an activist,
01:43:42 --> 01:43:46 who's out here challenging this administration, who's out here challenging
01:43:46 --> 01:43:53 these policies, who's asking for reforms to things that are supposed to be working for us.
01:43:55 --> 01:44:01 In the meantime, they're cutting Medicaid and they're talking about getting
01:44:01 --> 01:44:05 rid of Head Start and all that stuff. This is the typical playbook stuff, right?
01:44:06 --> 01:44:10 And then, you know, Gavin Newsom, I don't know what he's trying to do.
01:44:11 --> 01:44:13 I guess he's trying to run for president now.
01:44:13 --> 01:44:18 But, you know, he says that this whole Abrego Garcia situation is a distraction.
01:44:19 --> 01:44:23 It might be a distraction for you and what you're trying to achieve.
01:44:23 --> 01:44:27 It may be a distraction for you that you don't want to answer particular questions
01:44:27 --> 01:44:32 considering how big a Latino population you have in your state, Governor.
01:44:33 --> 01:44:39 But it is the very essence of what we were talking about when we said we did
01:44:39 --> 01:44:41 not want this man president again.
01:44:41 --> 01:44:46 This very isolated incident, because it really isn't isolated.
01:44:47 --> 01:44:50 If he can get away with this, Katie bar the door.
01:44:51 --> 01:44:55 He's already proven he's trying to bully the press. He's trying to bully the
01:44:55 --> 01:44:59 Federal Reserve chair, saying he's slow and all this stuff.
01:45:00 --> 01:45:03 And Donald Trump has no concept of what that man is doing.
01:45:03 --> 01:45:11 That man, Chairman Powell, has done as much as any American can during his tenure
01:45:11 --> 01:45:12 to keep everything together.
01:45:14 --> 01:45:18 And no, the financial system is not perfect, and neither is the judicial system or any
01:45:18 --> 01:45:20 other institution we have in the United States.
01:45:21 --> 01:45:25 But there are some people that are really trying to do the best they can to
01:45:25 --> 01:45:28 do their job and to be fair.
01:45:29 --> 01:45:32 If they come up short, it won't be because of lack of effort.
01:45:34 --> 01:45:40 The president, on the other hand, that's not the case. This is a deliberate attempt
01:45:40 --> 01:45:51 to maintain power and to punish anyone or anything that is a threat to his vision of the United States.
01:45:52 --> 01:46:00 And it's ironic that all this stuff is playing out 250 years after we
01:46:00 --> 01:46:05 told a tyrant that we didn't want to be his subjects anymore.
01:46:07 --> 01:46:15 Literally, as this podcast is going to air, it is basically the
01:46:15 --> 01:46:19 250th anniversary of the Battles of Lexington and Concord,
01:46:19 --> 01:46:23 the first battles of the war for American independence.
01:46:24 --> 01:46:29 And now that very independence is being threatened from within. Abraham Lincoln was right.
01:46:30 --> 01:46:33 If this country is going to fall, it's going to fall from within.
01:46:35 --> 01:46:39 And I'm not trying to aid and abet in that fall.
01:46:39 --> 01:46:46 I'm trying to do my best with my voice to preserve the democracy we have so
01:46:46 --> 01:46:49 that we continue to have a chance to make all of our lives better.
01:46:50 --> 01:46:53 Because if we're under any kind of tyrant, whether it's Donald Trump or anybody
01:46:53 --> 01:46:58 else, those chances diminish significantly.
01:46:58 --> 01:47:05 And it's happening in real time. And whatever false pablum that people are getting
01:47:05 --> 01:47:10 in that other echo chamber, pain is going to hit pretty soon.
01:47:10 --> 01:47:15 Because you can't be a tyrant and not subjugate everybody.
01:47:16 --> 01:47:19 You can't just subjugate a few people. You have to subjugate everybody.
01:47:19 --> 01:47:26 So, you know, even during the revolution, there were what we call Tories,
01:47:26 --> 01:47:30 people that still wanted to be under the rule of King George. We get that.
01:47:31 --> 01:47:36 But, you know, history dictated that we would become our own nation,
01:47:36 --> 01:47:42 a symbol that you don't have to have a strong ruler to have a strong government.
01:47:44 --> 01:47:49 So you can be like Gavin Newsom and dismiss it and say that this is a distraction.
01:47:49 --> 01:47:55 You can say, well, you know, that's the Latino's problem. That's not a black problem, whatever.
01:47:55 --> 01:47:58 You can do that, but you do it at your own peril.
01:47:58 --> 01:48:02 If the president of the United States can ignore the U.S.
01:48:02 --> 01:48:07 Supreme Court and control Congress, that's not good for us.
01:48:08 --> 01:48:12 It's not good for us as black people. That's not good for us as Americans.
01:48:14 --> 01:48:21 Period. So pay attention, watch how this thing plays out, and just be ready
01:48:21 --> 01:48:25 to deal with it effectively.
01:48:26 --> 01:48:28 Thank you for listening. Until next time.
01:48:31 --> 01:49:17 Music.