In this episode Erik Fleming celebrates his 61st birthday and leads a powerful conversation covering local and national news, protests, and justice. He interviews Kaylee Jade Peterson, a progressive candidate running for Congress in rural Idaho, about restoring trust in government, public lands, and economic challenges for working-class families. David Eliot, an AI researcher and author, discusses the human origins of AI, its societal impacts, surveillance concerns, and paths forward. The show also reflects on recent shootings, protests, and the urgency for civic action.
00:00:00 --> 00:00:06 Welcome. I'm Erik Fleming, host of A Moment with Erik Fleming, the podcast of our time.
00:00:06 --> 00:00:08 I want to personally thank you for listening to the podcast.
00:00:09 --> 00:00:12 If you like what you're hearing, then I need you to do a few things.
00:00:13 --> 00:00:19 First, I need subscribers. I'm on Patreon at patreon.com slash amomentwitherikfleming.
00:00:19 --> 00:00:24 Your subscription allows an independent podcaster like me the freedom to speak
00:00:24 --> 00:00:27 truth to power, and to expand and improve the show.
00:00:28 --> 00:00:32 Second, leave a five-star review for the podcast on the streaming service you
00:00:32 --> 00:00:35 listen to it on. That will help the podcast tremendously.
00:00:36 --> 00:00:41 Third, go to the website, momenterik.com. There you can subscribe to the podcast,
00:00:42 --> 00:00:47 leave reviews and comments, listen to past episodes, and even learn a little bit about your host.
00:00:47 --> 00:00:51 Lastly, don't keep this a secret like it's your own personal guilty pleasure.
00:00:52 --> 00:00:57 Tell someone else about the podcast. Encourage others to listen to the podcast
00:00:57 --> 00:01:02 and share the podcast on your social media platforms, because it is time to
00:01:02 --> 00:01:04 make this moment a movement.
00:01:04 --> 00:01:10 Thanks in advance for supporting the podcast of our time. I hope you enjoy this episode as well.
00:01:15 --> 00:01:20 The following program is hosted by the NBG Podcast Network.
00:02:00 --> 00:02:06 Hello, and welcome to another moment with Erik Fleming. I am your host, Erik Fleming.
00:02:07 --> 00:02:13 So today, as this episode drops, I'll be officially 61 years old,
00:02:13 --> 00:02:17 and it'll be just another day for me.
00:02:18 --> 00:02:25 I do acknowledge the fact that I am 61 years old, and I'm thankful that I've
00:02:25 --> 00:02:27 been able to live this long,
00:02:28 --> 00:02:34 especially as a black man in America, and we'll get into that later on in the
00:02:34 --> 00:02:37 show, along with some other things that happened.
00:02:37 --> 00:02:40 An update on Nekima Levy Armstrong, if you
00:02:41 --> 00:02:47 haven't been following: she was released along
00:02:47 --> 00:02:52 with the other two people who were arrested for organizing the protest,
00:02:52 --> 00:02:59 and Grace will mention that in the news summary. And they're still out there.
00:02:59 --> 00:03:04 They just had a press conference last week and, you know, we got,
00:03:04 --> 00:03:07 we got all sorts of things coming.
00:03:08 --> 00:03:10 You also know there was another shooting.
00:03:11 --> 00:03:14 Grace is going to mention that. I'm going to talk about that a little bit.
00:03:15 --> 00:03:19 And Don Lemon and Georgia Fort got arrested.
00:03:22 --> 00:03:28 So, yeah. Yeah, I'm going to get into a lot of that. But before I do,
00:03:29 --> 00:03:30 I'm going to have two guests on.
00:03:31 --> 00:03:37 So I'm very fortunate. Another person that's running for Congress is going to be on.
00:03:38 --> 00:03:43 And I'm going to have another person talking about artificial intelligence coming on.
00:03:46 --> 00:03:52 And this author is coming at AI from a sociologist's viewpoint,
00:03:52 --> 00:03:59 and it's a very, very good read, and it turned out to be a great interview, too.
00:03:59 --> 00:04:01 So I hope that you enjoy all that.
00:04:03 --> 00:04:07 And, you know, we're in an interesting time, but like I said,
00:04:07 --> 00:04:09 I'll get into that a little later.
00:04:09 --> 00:04:18 Please support us. You know, Georgia is an independent journalist in Minnesota.
00:04:19 --> 00:04:23 Don Lemon, after he left CNN, he became an independent journalist.
00:04:24 --> 00:04:30 All of us who are doing podcasts, you know, show your love and support.
00:04:30 --> 00:04:34 Listen to us. Follow us on social media.
00:04:34 --> 00:04:40 I don't post as much as some of these other folks, but I do at least post to
00:04:40 --> 00:04:45 let you know who's going to be on the show and, you know, just updates about
00:04:45 --> 00:04:48 what's going on with the podcast.
00:04:48 --> 00:04:52 But also to just, you know, do what you can to support the podcast.
00:04:52 --> 00:05:01 This is our 13th season, and if we do the math, right...
00:05:02 --> 00:05:08 Sixth year? Seventh year. It'll be our seventh year in July of doing the podcast.
00:05:10 --> 00:05:14 So support us any way you can. If you want to support this podcast,
00:05:14 --> 00:05:20 go to www.momenterik.com and do that.
00:05:20 --> 00:05:26 But please, please stand by those folks who are trying to tell you the truth
00:05:26 --> 00:05:29 and bring you people that are doing the work.
00:05:30 --> 00:05:35 Because we are inundated by the foolishness every day.
00:05:36 --> 00:05:42 So those of us that are trying to give you hope and give you encouragement and
00:05:42 --> 00:05:47 give you what's really going on, please, please stand by us.
00:05:48 --> 00:05:52 All right, speaking about news, let's go ahead and kick this program off.
00:05:52 --> 00:05:56 And as always, we kick it off with a moment of news with Grace G.
00:06:04 --> 00:06:10 Thanks, Erik, and happy birthday. Federal agents killed Alex Pretti, a 37-year-old U.S.
00:06:10 --> 00:06:15 citizen in Minneapolis, during a confrontation, sparking intense local protests
00:06:15 --> 00:06:18 and condemnation following the second such incident this month.
00:06:19 --> 00:06:24 Gregory Bovino was removed from his high-level U.S. Border Patrol post and replaced
00:06:24 --> 00:06:26 by Border Czar Tom Homan.
00:06:27 --> 00:06:31 Tracy Mergen, an FBI agent investigating the fatal shooting of Renee Good by
00:06:31 --> 00:06:33 a federal officer, resigned.
00:06:33 --> 00:06:38 Three activists, including Nekima Levy Armstrong, were released from detention
00:06:38 --> 00:06:42 after being charged with federal crimes for disrupting a church service to protest
00:06:42 --> 00:06:45 an ICE official's leadership role within the congregation.
00:06:45 --> 00:06:50 A man was arrested after spraying Representative Ilhan Omar with a pungent liquid
00:06:50 --> 00:06:55 during a Minneapolis event where she was speaking against federal immigration policies.
00:06:56 --> 00:07:01 A 28-year-old man was taken into custody on Friday night following an alleged assault on U.S.
00:07:01 --> 00:07:06 Representative Maxwell Frost during a Sundance film festival party in Park City, Utah.
00:07:06 --> 00:07:11 The FBI searched a Fulton County election office in Georgia following President
00:07:11 --> 00:07:14 Trump's claims regarding the 2020 election results.
00:07:15 --> 00:07:19 A Virginia judge has halted an effort by state Democrats to implement a more
00:07:19 --> 00:07:23 favorable congressional redistricting map. The U.S.
00:07:23 --> 00:07:27 Justice Department has joined a lawsuit alleging that the UCLA David Geffen
00:07:27 --> 00:07:31 School of Medicine illegally uses race as a factor in its admissions process.
00:07:32 --> 00:07:36 Relatives of two Trinidadian men killed in a U.S. missile strike near Venezuela
00:07:36 --> 00:07:40 have filed a wrongful death lawsuit claiming the military campaign unlawfully
00:07:40 --> 00:07:42 targeted civilian vessels.
00:07:42 --> 00:07:47 A severe winter storm, along with Arctic cold, paralyzed much of the eastern
00:07:47 --> 00:07:52 United States, killing 62 people, causing massive travel disruptions and leaving
00:07:52 --> 00:07:54 over a million people without power.
00:07:54 --> 00:08:00 And the South Carolina measles outbreak has reached 789 reported cases.
00:08:00 --> 00:08:04 I am Grace Gee, and this has been a Moment of News.
00:08:11 --> 00:08:15 All right. Thank you, Grace, for that moment of news and for the birthday wishes.
00:08:16 --> 00:08:23 That means a lot. Thank you so much. All right. It's time for my guest, Kaylee Jade Peterson.
00:08:24 --> 00:08:29 Kaylee Jade Peterson is a 35-year-old working-class mother of two who is making
00:08:29 --> 00:08:31 her third run for the U.S.
00:08:31 --> 00:08:34 House as a Democrat in Idaho's first congressional district.
00:08:35 --> 00:08:38 She was recruited to run during her freshman year at community college,
00:08:38 --> 00:08:44 where she double majored in criminal justice and political science, because
00:08:44 --> 00:08:45 it was nearly impossible to find
00:08:45 --> 00:08:49 candidates in a district that received no financial support or attention.
00:08:50 --> 00:08:55 She grew up with a single mother who worked two to three jobs at a time,
00:08:55 --> 00:08:58 and was very politically active as a teenager,
00:08:59 --> 00:09:03 being an advocate and public speaker for diverse students in Idaho's foster
00:09:03 --> 00:09:10 and school systems and managing local state representative campaigns in 2008 before
00:09:10 --> 00:09:11 life went a different direction.
00:09:12 --> 00:09:18 She met her husband 16 years ago, had her daughter at 21, and became a stay-at-home
00:09:18 --> 00:09:21 mom who worked on the side because they couldn't afford childcare.
00:09:22 --> 00:09:29 She spent eight years at home, had her son, and became a foster mom before deciding
00:09:29 --> 00:09:34 she needed to get involved in public policy again and started attending college at 30 years old.
00:09:35 --> 00:09:39 She has spent the last four years learning how to do rural progressive politics
00:09:39 --> 00:09:44 differently, how to communicate to rural voters successfully without compromising
00:09:44 --> 00:09:48 her progressive policy, putting the focus on working-class families and people,
00:09:49 --> 00:09:54 holding elected officials accountable, and organizing in a rural district that
00:09:54 --> 00:09:58 has been left behind by national progressives for nearly three decades.
00:09:58 --> 00:10:03 Ladies and gentlemen, it is my distinct honor and privilege to have as a guest
00:10:03 --> 00:10:07 on this podcast, Kaylee Jade Peterson.
00:10:18 --> 00:10:23 All right. Kaylee Jade Peterson, how are you doing?
00:10:23 --> 00:10:28 I'm doing all right. I think it's a hard mix, especially right now in American
00:10:28 --> 00:10:33 politics, where you feel kind of this heavy weight of everything that's happening
00:10:33 --> 00:10:38 right now, but then also the joys of getting to work with a really amazing community.
00:10:38 --> 00:10:41 So we try and ride that balance right now.
00:10:41 --> 00:10:46 Yeah, yeah. You know, I always tell candidates when they come on the show that
00:10:46 --> 00:10:49 I'm doing better than they are because I'm not running.
00:10:50 --> 00:10:54 I've been there, done that. So I know what you're going through and I appreciate
00:10:54 --> 00:10:58 you sharing some time in the campaign to talk to me.
00:10:59 --> 00:11:04 One of the things that I like to do to kind of kick everything off is I do a couple of icebreakers.
00:11:04 --> 00:11:12 So the first icebreaker I want you to participate in is a quote that I want you to respond to.
00:11:13 --> 00:11:18 And the quote is, together we possess a shared stake in improving and streamlining
00:11:18 --> 00:11:22 government, even if it means facing uncomfortable truths.
00:11:22 --> 00:11:28 The pursuit of a more perfect union demands this commitment each and every day.
00:11:28 --> 00:11:29 What does that quote mean?
00:11:30 --> 00:11:37 To me, it's everything that I run on, which is this idea that our country and
00:11:37 --> 00:11:42 our government is not just some abstract political process and a bunch of men in suits.
00:11:42 --> 00:11:48 It is really the supposed ideals that we believe in as a shared nation.
00:11:48 --> 00:11:53 It's the constant striving to fulfill these ideals and these promises.
00:11:53 --> 00:11:58 It is the opportunity of us to innovate the brightest possible future for each
00:11:58 --> 00:12:00 and every person that lives here.
00:12:00 --> 00:12:07 Every child, every student, every working class mom and dad,
00:12:07 --> 00:12:09 every small business owner, every farmer.
00:12:09 --> 00:12:15 I mean, literally the success of every member of the community rides on how
00:12:15 --> 00:12:21 well our government is performing and what our government is doing in order to serve these people.
00:12:22 --> 00:12:26 And so there is this really, really heavy obligation for those in elected office
00:12:26 --> 00:12:33 to try and constantly strive for that level of perfection or at least fulfill those promises.
00:12:33 --> 00:12:37 It's everything that government's supposed to be that I think we've lost sight of.
00:12:38 --> 00:12:43 Yeah. OK. Now, the next icebreaker is going to be what we call 20 questions.
00:12:44 --> 00:12:50 So I need you to give me a number between one and 20. Eleven. All right.
00:12:51 --> 00:12:57 Where do you go to check a fact that you see, hear, or read?
00:12:58 --> 00:13:04 Oh, I'm a former speech and debate kid, so there isn't really one source of information.
00:13:04 --> 00:13:08 What I'll tend to do is research where the information came from,
00:13:08 --> 00:13:14 and then I'll go to any institutions or organizations that have a basis in this,
00:13:14 --> 00:13:15 so whether it's scientific, academic,
00:13:16 --> 00:13:20 whether it's a governmental agency, and then I'll look and see where there might
00:13:20 --> 00:13:25 be other sources of information on that same topic and try to balance it.
00:13:25 --> 00:13:29 I think that's the best way we can try to find truth right now.
00:13:30 --> 00:13:36 Yeah. All right. So I've been to 36 states in my lifetime, but I've never been to Idaho.
00:13:36 --> 00:13:41 So tell me something about the state you grew up in and want to represent in Congress.
00:13:42 --> 00:13:47 Idaho is this weird, magical place because the state itself is huge.
00:13:48 --> 00:13:53 My district is over 500 miles long. And so it takes about 10 to 12 hours to
00:13:53 --> 00:13:55 get from one side of my district to the other.
00:13:55 --> 00:14:02 And while Idaho seems like this very homogenous state, my district is almost
00:14:02 --> 00:14:04 like four separate states all in one.
00:14:04 --> 00:14:08 So you go to North Idaho, you have the Pacific Northwest, you have these gorgeous
00:14:08 --> 00:14:12 lakes and mountains and the most beautiful places you can imagine.
00:14:12 --> 00:14:16 And then you go a little further south and you have the Palouse and these rolling
00:14:16 --> 00:14:22 waves of yellow grain and purple grain and such incredible agriculture and small
00:14:22 --> 00:14:24 farming families that have been doing it for generations.
00:14:24 --> 00:14:31 You have a college town, University of Idaho, which is one of the most vibrant
00:14:31 --> 00:14:34 college communities that I've ever been a part of.
00:14:34 --> 00:14:37 And then you also have really urban areas.
00:14:37 --> 00:14:41 And then peppered throughout, we have areas that are so rural that they're still
00:14:41 --> 00:14:47 considered frontier. So Idaho is really this magical melting pot where you can
00:14:47 --> 00:14:51 get a little bit of everything depending on what part of the state you're in.
00:14:51 --> 00:14:58 I think what we remember and what we value and try to represent is how kind
00:14:58 --> 00:15:02 Idaho was and how close-knit our communities were.
00:15:02 --> 00:15:05 It was the kind of place where you could never meet a stranger,
00:15:05 --> 00:15:10 whether you were in line at a gas station or walking down the sidewalk in the middle of Boise.
00:15:10 --> 00:15:14 It was the kind of place where you could easily build community and connection.
00:15:15 --> 00:15:18 Yeah. So how did your journey in politics start?
00:15:19 --> 00:15:26 Mine is a unique and definitely non-traditional trajectory into the political process.
00:15:26 --> 00:15:32 I was a very, very political child, back before there was Google and websites,
00:15:32 --> 00:15:37 when it was still considered odd to be talking about the Bush presidency in elementary school.
00:15:37 --> 00:15:42 So I got my start really writing the Clinton administration over the handling
00:15:42 --> 00:15:46 of Kosovo. And I wrote all of my state senators, I wrote the Clinton administration,
00:15:46 --> 00:15:48 and I got these responses.
00:15:48 --> 00:15:53 And it just kind of pushed me out into being more politically involved.
00:15:54 --> 00:15:58 So I remember the no-blood-for-oil protests that went on during the
00:15:58 --> 00:16:01 first deployment of Operation Iraqi Freedom.
00:16:01 --> 00:16:06 And I spoke at many of the protests across Idaho, trying to hold the Bush administration
00:16:06 --> 00:16:10 accountable for the lies that they told to get us into that conflict.
00:16:11 --> 00:16:17 And I was grateful to become a public speaker and advocate by the time I was 12, 13 years old.
00:16:17 --> 00:16:21 I would speak at conferences for diverse youth in our foster and school systems
00:16:21 --> 00:16:25 here, working with law enforcement and social workers, students,
00:16:25 --> 00:16:30 educators on how we could try and legislate a more successful pathway for all
00:16:30 --> 00:16:31 students to be successful.
00:16:31 --> 00:16:37 I ended up managing a campaign when I was 18 years old, and this was in 08.
00:16:37 --> 00:16:42 So it was Obama's first election. That was my first foray into managing a local
00:16:42 --> 00:16:48 campaign and being a part of this much bigger political process of community
00:16:48 --> 00:16:52 and volunteers and door knocking and events on campus.
00:16:53 --> 00:16:56 So for me, then life kind of just took a different direction.
00:16:56 --> 00:16:59 I wasn't able to jump into university at 18.
00:16:59 --> 00:17:03 And so I almost kind of started over, met my husband. We had a family.
00:17:04 --> 00:17:09 I had my daughter when I was 21. We became foster parents. Then we had my second child.
00:17:10 --> 00:17:13 And I became a stay-at-home mom almost out of necessity.
00:17:13 --> 00:17:18 I worked nights and weekends, but I was home with my kids during the day until
00:17:18 --> 00:17:24 the 2016 election. And I think I saw such a drastic shift.
00:17:24 --> 00:17:27 It wasn't just kind of corporate corruption or the normal corruption we were
00:17:27 --> 00:17:29 expecting to see from our federal government.
00:17:30 --> 00:17:34 There was this drastic shift in the dialogue, the way we talked about policy,
00:17:35 --> 00:17:39 name-calling, the vitriol, the way that we looked at somebody with an R or a D next
00:17:39 --> 00:17:43 to their name instead of any kind of character or integrity.
00:17:44 --> 00:17:47 And I decided that as soon as my kids were old enough that I'd go back to school
00:17:47 --> 00:17:52 and I would get my degrees and hopefully have some kind of positive impact.
00:17:52 --> 00:17:57 So in 2020, right after the COVID lockdown, when I was 30 years old,
00:17:57 --> 00:17:59 I enrolled in college.
00:17:59 --> 00:18:03 I double majored in criminal justice and political science.
00:18:03 --> 00:18:08 And it was my freshman year at the College of Western Idaho that I saw a Facebook
00:18:08 --> 00:18:13 post from a local political group I had joined looking for volunteers that were
00:18:13 --> 00:18:15 willing to put their name down on the ballot.
00:18:15 --> 00:18:18 And at that point, I said, that's definitely something I can do.
00:18:18 --> 00:18:22 I'm not shy. I'm not worried about having my name out there. I'll do this.
00:18:23 --> 00:18:30 But ultimately, it became this opportunity to build something that Idaho was desperately needing.
00:18:30 --> 00:18:33 There's no money in my district. There's no support in my district.
00:18:34 --> 00:18:36 The DNC doesn't invest in my district.
00:18:37 --> 00:18:42 There hadn't been any real progressive politics in the state of Idaho in nearly two decades.
00:18:42 --> 00:18:46 And I fell in love with the work that I get to do, the communities that I was
00:18:46 --> 00:18:51 building with, the contacts I was making, getting volunteers mobilized,
00:18:51 --> 00:18:52 getting neighbors connected.
00:18:53 --> 00:18:56 And I realized that there was huge opportunity, that I loved the work, and that
00:18:56 --> 00:19:01 I was going to continue doing it until we could right the wrongs of the current
00:19:01 --> 00:19:02 leadership in the state.
00:19:02 --> 00:19:06 So, yeah, it's been a long time coming, but also kind of came out of nowhere at the same time.
00:19:08 --> 00:19:14 Well, it sounds like you were an activist and then, you know,
00:19:14 --> 00:19:19 that fire just got rekindled based on what you were seeing.
00:19:19 --> 00:19:22 So I don't think that's an unusual path to politics at all.
00:19:22 --> 00:19:27 I think that's commendable that you decided to get back in there.
00:19:28 --> 00:19:34 And to take it to another level. Why is restoring trust your top priority as a candidate?
00:19:35 --> 00:19:42 I think it is the distrust that voters across the country have in the government
00:19:42 --> 00:19:47 that has led to the kind of hyper-partisan rhetoric and manipulation,
00:19:47 --> 00:19:54 specifically of rural communities that I see happening from media and social media networks.
00:19:54 --> 00:19:58 And it feels like, because the government
00:19:58 --> 00:20:02 had corruption, because career politicians
00:20:02 --> 00:20:05 and elected officials didn't truly value their place
00:20:05 --> 00:20:09 and were known for lying, you
00:20:09 --> 00:20:13 know, everybody knew that politicians were taking money from places they
00:20:13 --> 00:20:17 never should have and were voting for things that didn't work for their state, and that distrust
00:20:17 --> 00:20:25 bred this vulnerability to, you know, Ellison and all of these right-wing
00:20:25 --> 00:20:28 media networks to shift the narrative.
00:20:28 --> 00:20:33 And so for me, if we want to fix this hyper-partisan divide,
00:20:33 --> 00:20:39 if we want to fix the ability to communicate with these communities about what needs to happen,
00:20:40 --> 00:20:43 what would actually help, that if I have a D next to my name,
00:20:43 --> 00:20:46 I'm not the enemy, that I'm not trying to destroy their way of life.
00:20:47 --> 00:20:53 Then I think kind of dismantling a system that didn't work for anybody and replacing
00:20:53 --> 00:21:00 it with this transparency and integrity and the idea of leadership that we all have,
00:21:00 --> 00:21:05 where elected officials were a part of the community, that they represented the community,
00:21:05 --> 00:21:08 that they knew the families, businesses,
00:21:08 --> 00:21:10 and industries and issues, and that
00:21:10 --> 00:21:13 they were present even when they weren't in office.
00:21:13 --> 00:21:19 And I think returning that will help us overcome that R versus D,
00:21:19 --> 00:21:25 that MAGA versus progressives, that trying to overcome that huge divide that
00:21:25 --> 00:21:28 is hurting rural communities and rural districts now.
00:21:28 --> 00:21:31 And I think that there's a lot of opportunity in restoring that trust.
00:21:32 --> 00:21:41 Yeah, and it's very important, because somebody mentioned the word authenticity
00:21:41 --> 00:21:48 when I was talking to them, and I think that's what people really want.
00:21:48 --> 00:21:54 Consider what the majority of Americans want.
00:21:54 --> 00:22:00 You know, we've got, I don't know, maybe a third of the nation that's Democrat,
00:22:00 --> 00:22:02 a third of the nation is Republican.
00:22:02 --> 00:22:05 And it all varies when we start breaking it down in districts.
00:22:05 --> 00:22:11 But I think the overwhelming majority of people, regardless of what camp they
00:22:11 --> 00:22:14 fall in, want somebody that's genuine and authentic.
00:22:16 --> 00:22:20 And that's the most important thing to convey.
00:22:21 --> 00:22:27 And they also want to elect somebody that seems like they're enthused about doing the job.
00:22:27 --> 00:22:32 And just in the brief conversation we've had so far, I think you're enthused.
00:22:32 --> 00:22:37 I think you are a person who really wants to do this. So I think that's going
00:22:37 --> 00:22:38 to carry a long way for you.
00:22:39 --> 00:22:44 All right, let's get into some issues. Are Idahoans concerned about the Epstein files?
00:22:44 --> 00:22:49 And if not, what are they concerned about and how will you go about addressing those concerns?
00:22:50 --> 00:22:55 The Epstein Files might be the first bipartisan issue that we have in the state of Idaho.
00:22:55 --> 00:23:00 It is really the one place where I don't care how conservative or progressive
00:23:00 --> 00:23:07 you are, we all understood that this was the epitome of corruptive power and
00:23:07 --> 00:23:10 the abuse of the most vulnerable among us.
00:23:10 --> 00:23:15 And that's what the Epstein Files truly represents is our ability to hold those
00:23:15 --> 00:23:19 at the top levels of either government or business accountable.
00:23:19 --> 00:23:25 We're all sick and tired of watching those at the top have little to no consequence,
00:23:25 --> 00:23:28 no matter how horrific the things they've done are.
00:23:28 --> 00:23:30 And the Epstein files is that case.
00:23:31 --> 00:23:36 And it's been really, really frustrating for me specifically because I'm running
00:23:36 --> 00:23:40 against somebody who's incredibly close to the Trump administration.
00:23:40 --> 00:23:41 He's close with the family.
00:23:42 --> 00:23:45 They actually just started businesses and enterprises together.
00:23:46 --> 00:23:50 So when he's in the state of Idaho, when he's talking on the radio or in the
00:23:50 --> 00:23:54 news, he's 100 percent supportive of releasing the Epstein files.
00:23:54 --> 00:23:59 But he refuses to do anything that would actually cause the release.
00:23:59 --> 00:24:00 He refused to sign the petition.
00:24:00 --> 00:24:04 He refuses to hold the oversight committee responsible for the fact that we're
00:24:04 --> 00:24:08 over a month out now from when the files were supposed to be released.
00:24:08 --> 00:24:13 So for me, it's talking about this isn't just some big political headline.
00:24:13 --> 00:24:19 This is the abuse of the most vulnerable in our community. This is the abuse of children.
00:24:20 --> 00:24:25 And I think in conservative circles, they use the term save the children so
00:24:25 --> 00:24:32 often that the Epstein files become an opportunity for us to come together and agree on this.
00:24:32 --> 00:24:34 And it's something we're trying to push.
00:24:35 --> 00:24:39 But at the same time, it perfectly highlights the level of corruption.
00:24:39 --> 00:24:43 No matter what somebody says, what they do is often totally different when they're
00:24:43 --> 00:24:48 in elected office. And it's about trying to hold them accountable, especially Epstein
00:24:48 --> 00:24:51 and every single name in that file.
00:24:52 --> 00:24:55 I think it's horrific. You saw the testimony of Sasha Riley,
00:24:56 --> 00:25:01 and he's naming people like Jim Jordan and Andy Biggs who are sitting in these
00:25:01 --> 00:25:03 kind of oversight committees.
00:25:03 --> 00:25:09 So this is far-reaching, and every single person that I've met who's not in
00:25:09 --> 00:25:14 the files wants to see them released immediately and wants to see a real
00:25:14 --> 00:25:18 hammer of justice brought down on those culpable in it.
00:25:19 --> 00:25:25 Yeah. So what's another big issue that you want to
00:25:25 --> 00:25:28 address once you get elected to Congress?
00:25:28 --> 00:25:34 It's interesting. In Idaho, I think there are a few top issues.
00:25:35 --> 00:25:40 In Idaho, we have such a large majority of federally owned lands.
00:25:40 --> 00:25:44 So we are a very big wilderness state. The last major name in the state was
00:25:44 --> 00:25:50 Frank Church, and he was such an incredible pioneer for environmental protection.
00:25:50 --> 00:25:53 We actually have the Frank Church Wilderness Area.
00:25:53 --> 00:25:58 So the people who move to our state are often people who love their sportsmen.
00:25:58 --> 00:26:00 They love hiking and camping.
00:26:01 --> 00:26:05 They love being able to take family out on the weekends and spend it in nature.
00:26:05 --> 00:26:07 They love rafting the rivers.
00:26:07 --> 00:26:11 A huge amount of business in our state is done alongside our rivers.
00:26:12 --> 00:26:18 And public lands then becomes one of the other few issues that the vast majority of Idahoans agree on.
00:26:18 --> 00:26:24 So 98% of Idahoans agree that public lands is essential to the quality of life in our state.
00:26:24 --> 00:26:28 And right now, they are 100% under threat.
00:26:28 --> 00:26:32 Obviously, we have the new BLM director who's coming in, who has been a huge
00:26:32 --> 00:26:34 proponent of selling off lots.
00:26:34 --> 00:26:40 They had the keep public lands in public hands bill, which my opponent was actually
00:26:40 --> 00:26:44 the only member of our congressional delegation, even though they're all Republican.
00:26:44 --> 00:26:50 My opponent was the only one who voted against keeping public lands in public hands.
00:26:50 --> 00:26:56 So it's not just a quality of life issue because how clean our water and our
00:26:56 --> 00:27:01 soil is and the four or five tribes that operate in my district off of the land,
00:27:01 --> 00:27:07 but also just how essential it is to how we view our way of life here and that
00:27:07 --> 00:27:09 it is under such threat right now.
00:27:10 --> 00:27:14 Not even to mention the huge cuts that were made to the Forest Service.
00:27:14 --> 00:27:17 I work really closely with the Federal Employees Union.
00:27:17 --> 00:27:23 They were gutted earlier last year. And they actually laid off most of the people
00:27:23 --> 00:27:27 in the Forest Service on Valentine's Day last year.
00:27:27 --> 00:27:32 People are now left to try and manage these lands on a shoestring budget,
00:27:32 --> 00:27:34 without the staff they need, in really difficult places.
00:27:35 --> 00:27:40 So public lands is huge. But then day to day, we are a working class state.
00:27:40 --> 00:27:44 The average income in my district is $37,000 a year.
00:27:45 --> 00:27:49 And rural health care is suffering. Rural education is suffering.
00:27:49 --> 00:27:53 Our wages are stagnating while the cost of housing and rent explodes.
00:27:53 --> 00:27:59 So for me, I think one of my biggest priorities is just the economic well-being
00:27:59 --> 00:28:04 of lower working class families in the state and how they are able to thrive
00:28:04 --> 00:28:10 and compete in a market where we've kind of sold them out to monopolized interests,
00:28:11 --> 00:28:15 to corporate interests, to big ag that they cannot compete against,
00:28:15 --> 00:28:18 that exploits their labor and their time.
00:28:18 --> 00:28:22 And then they're left just trying to pick up the pieces and take care of their families.
00:28:22 --> 00:28:27 And we are an incredibly hardworking state. And we take a lot of pride in our
00:28:27 --> 00:28:29 work ethic and the grind.
00:28:29 --> 00:28:34 And so to see these incredible people who work so hard to give back to our communities
00:28:34 --> 00:28:39 then suffer just trying to have the basic essentials is really difficult and
00:28:39 --> 00:28:40 definitely a priority for me.
00:28:41 --> 00:28:48 Yeah. All right. So would you vote for legislation that will abolish ICE?
00:28:49 --> 00:28:58 Yes, 100%. I want to dismantle this entire system, this DHS,
00:28:58 --> 00:29:00 the Immigration Services.
00:29:00 --> 00:29:07 I mean, we need to completely reimagine the way that our government functions.
00:29:07 --> 00:29:12 One, because incarceration is the most expensive and least effective tool that
00:29:12 --> 00:29:15 we have in the country to address social issues.
00:29:15 --> 00:29:18 We know that it's failed. It's been failing for over five decades.
00:29:19 --> 00:29:22 And there are only a handful of people who can benefit from ICE,
00:29:23 --> 00:29:27 from DHS, and from the way that our government's operated. But also the lack of trust.
00:29:28 --> 00:29:33 They have slaughtered American citizens. They are snatching American children,
00:29:33 --> 00:29:36 whether they are black or brown or white.
00:29:36 --> 00:29:39 They are taking our children, our neighbors, our co-workers,
00:29:39 --> 00:29:45 valued members of our communities in indefinite detention into these camps that
00:29:45 --> 00:29:51 are obviously privately owned and privately profiting off of this kind of harm to our communities.
00:29:51 --> 00:29:53 And there's no un-ringing that bell.
00:29:53 --> 00:29:58 You cannot walk back and say, OK, well, we've addressed these problems and we've
00:29:58 --> 00:30:02 retrained, and now this is an agency fit to be operating.
00:30:03 --> 00:30:05 That's just not how this is going to work going forward.
00:30:06 --> 00:30:11 So 100%, I'm all for abolishing ICE. But I think it's also important we communicate
00:30:11 --> 00:30:13 that doesn't mean just dismantling a system.
00:30:14 --> 00:30:19 That means reimagining the way that we get to do it. We get to build something
00:30:19 --> 00:30:21 back better than it has ever been before.
00:30:22 --> 00:30:27 And that's where I think that common ground lies: why don't we streamline immigration?
00:30:27 --> 00:30:32 Why don't we reimagine and innovate an immigration system that works for migrants,
00:30:33 --> 00:30:38 that works for those seeking asylum, that works for local communities, law enforcement?
00:30:38 --> 00:30:45 That's where I really want to spend our time and our funding and leave this
00:30:45 --> 00:30:47 incredibly dark chapter behind.
00:30:47 --> 00:30:52 There's no excuses that can be made. There's no apologies that can be had.
00:30:52 --> 00:30:58 They have murdered at least two American citizens, if not the 38 people that
00:30:58 --> 00:31:00 have died in ICE custody since this began.
00:31:01 --> 00:31:05 And there has to be accountability there. And I'm not even just talking about
00:31:05 --> 00:31:07 dismantling and abolishing ICE.
00:31:07 --> 00:31:12 I am talking about bringing justice, actual criminal justice,
00:31:12 --> 00:31:16 for those who participated in these murders.
00:31:17 --> 00:31:22 Yeah. Yeah. And I agree totally on that last point for sure.
00:31:22 --> 00:31:31 You know, when I ran in 2008, in 2006, even, you know, I made the argument that,
00:31:31 --> 00:31:36 you know, we've had immigrants coming into this nation.
00:31:37 --> 00:31:42 And, you know, when we had Ellis Island and all that stuff, and then even the
00:31:42 --> 00:31:44 movement of people within this nation,
00:31:44 --> 00:31:50 we didn't have all this technology, but yet we were allowing people to come
00:31:50 --> 00:31:54 in and make contributions eventually to the growth of this nation.
00:31:55 --> 00:32:01 And I'm just amazed that now we are so technologically advanced that we can't
00:32:01 --> 00:32:04 seem to have a simple solution for immigration.
00:32:04 --> 00:32:09 So I'm with you on that, on totally dismantling it.
00:32:10 --> 00:32:15 Did you support the Democrats' position during the government shutdown the first time?
00:32:16 --> 00:32:22 You know, I would have been able to find a way to support it if they had actually
00:32:22 --> 00:32:27 accomplished affordable and accessible health care on the other side.
00:32:27 --> 00:32:33 But right now, with not only health care, but what's happening with ICE,
00:32:33 --> 00:32:38 I think every single one of us is saying, where are these supposed leaders?
00:32:38 --> 00:32:43 Like, where are these Democrats that are in positions of power and authority
00:32:43 --> 00:32:48 that can truly lead a movement to offering things like health care and protection
00:32:48 --> 00:32:53 from an out-of-control executive branch's mercenary army?
00:32:53 --> 00:32:59 And so when the shutdown happened, we were all trying to hold tight.
00:32:59 --> 00:33:03 I was so proud of our American Federation of Government Employees,
00:33:04 --> 00:33:06 which oversaw our TSA members.
00:33:06 --> 00:33:10 They stepped up during that government shutdown to make sure that these federal
00:33:10 --> 00:33:15 employees had household goods and toiletries and hygiene products and food.
00:33:15 --> 00:33:20 And I saw this brotherhood of our unions step up in this community there.
00:33:20 --> 00:33:23 So we were all saying, okay, we will hold the line.
00:33:24 --> 00:33:26 We believe in this. We will make this work.
00:33:27 --> 00:33:31 So then to turn their back on these federal employees who sacrificed so much
00:33:31 --> 00:33:34 and the harm that came to these communities, and not be able to deliver the
00:33:34 --> 00:33:39 one thing that they were standing for, it's so frustrating.
00:33:39 --> 00:33:44 And I get angry because I look at what my opponent was saying during that government shutdown.
00:33:44 --> 00:33:49 The entire alt-right had spent those 30 days saying that, whatever,
00:33:49 --> 00:33:53 Democrats wanted to give $1.5 trillion, and they used the word illegal.
00:33:54 --> 00:33:55 We would say undocumented immigrants.
00:33:55 --> 00:34:01 They tried to make it look like the ACA subsidies and our Medicaid were all going
00:34:01 --> 00:34:05 to the undocumented criminals, the way that they tried to define this community
00:34:05 --> 00:34:06 that doesn't exist, right?
00:34:06 --> 00:34:10 So for me, it was hard to support
00:34:10 --> 00:34:13 the Democrats when they weren't up there trying to
00:34:13 --> 00:34:17 fight off this completely false narrative. It's
00:34:17 --> 00:34:20 hard to excuse their absence right
00:34:20 --> 00:34:25 now. And you can't justify the shutdown, considering they gave in and didn't get
00:34:25 --> 00:34:29 anything accomplished for the people. So at the time I was trying to be supportive,
00:34:29 --> 00:34:37 but looking back, there are no excuses. Yeah. So your thing is, I'm down with
00:34:37 --> 00:34:39 the fight, but we got to win the fight.
00:34:39 --> 00:34:47 We can't just, you know, say, okay, it's like enough is enough and we'll play ball again.
00:34:48 --> 00:34:52 Your thing is if we're going to do this, let's get some results that really help the people.
00:34:54 --> 00:35:00 Exactly. Especially, I mean, the Democrats in Congress weren't suffering during the shutdown, right?
00:35:00 --> 00:35:05 It was my working class constituents and the people serving my communities that
00:35:05 --> 00:35:08 felt, you know, this shutdown.
00:35:09 --> 00:35:14 So I'm trying to understand the movements. I'm trying to understand logic.
00:35:14 --> 00:35:16 I'm trying to look at the bigger picture and think, OK, well,
00:35:17 --> 00:35:20 if they are trying to achieve affordable and accessible health care for these
00:35:20 --> 00:35:23 same families, then I will stand in this line with them.
00:35:24 --> 00:35:26 But then to give up was just inexcusable.
00:35:26 --> 00:35:31 And I think it's the problem that the Democrats have seen really for the last
00:35:31 --> 00:35:34 few years is where is the leadership?
00:35:34 --> 00:35:36 Where is the fight? Where is the strategy?
00:35:37 --> 00:35:41 Because the right is incredibly well organized. No matter how much they dislike
00:35:41 --> 00:35:48 each other, no matter how different their outcomes are or their values are, they are all united.
00:35:48 --> 00:35:52 They are pushing the same message. They are using the same strategies.
00:35:52 --> 00:35:56 They are all coordinated, especially in states like Idaho.
00:35:57 --> 00:36:00 And so for the left and for the Democratic Party, it's like,
00:36:00 --> 00:36:02 where is the strategy? Where is this fight?
00:36:03 --> 00:36:08 And I'd like to see them step up in a way where it feels like somebody or adults
00:36:08 --> 00:36:14 in the house trying to do the right thing back in D.C., but we have yet to see that, really.
00:36:15 --> 00:36:19 Yeah. All right. So this is going to be kind of a rapid-fire deal, but...
00:36:20 --> 00:36:26 Would you support, well, do you support the current situation that we have with
00:36:26 --> 00:36:32 Venezuela that I guess now the president has declared himself in charge of Venezuela?
00:36:33 --> 00:36:37 So I guess it's a territory. Do you support that position at all?
00:36:39 --> 00:36:43 No, no. I mean, listen, I should note here:
00:36:43 --> 00:36:47 I think he was horrific. But the idea that we are bombing civilians in the capital
00:36:47 --> 00:36:51 city, the fact that we have not legitimized the opposition party,
00:36:51 --> 00:36:53 which did democratically win the last election,
00:36:54 --> 00:37:01 this seems like just a land grab for oil execs at Exxon, and it's incredibly reckless.
00:37:01 --> 00:37:05 It's definitely not America first. No, definitely not.
00:37:05 --> 00:37:09 Okay. Do you support military aid to Ukraine?
00:37:09 --> 00:37:16 I do. I think, for one, Russia is a huge threat to our national security,
00:37:17 --> 00:37:20 and the fact that we are not addressing it as such really aggrieves me.
00:37:21 --> 00:37:25 Ukraine is essential. The elements, the mineral rights, and for Russia to get
00:37:25 --> 00:37:30 those becomes a huge issue, not just for America, but for the global community and stability.
00:37:31 --> 00:37:34 So providing aid to Ukraine isn't just the right thing.
00:37:34 --> 00:37:38 It's important for us to be there for our allies to push back
00:37:38 --> 00:37:43 against a threat like Putin, but it's also to protect these elements that Russia
00:37:43 --> 00:37:44 would be able to do a lot of harm with.
00:37:45 --> 00:37:47 All right. Do you support military aid to Israel?
00:37:48 --> 00:37:50 No, not a cent.
00:37:51 --> 00:37:56 Netanyahu and this government are committing absolute atrocities.
00:37:56 --> 00:38:00 And I think with our position on the global stage, or at least the position we had
00:38:00 --> 00:38:07 six months ago before the Trump administration destroyed any trust we had on the international stage,
00:38:07 --> 00:38:13 we are supposed to be the ones stepping up and drawing the line against these war crimes.
00:38:14 --> 00:38:17 There's no two ways about it.
00:38:17 --> 00:38:21 I mean, this is a genocide and atrocities, and we should be standing up and
00:38:21 --> 00:38:23 protecting those most vulnerable.
00:38:23 --> 00:38:26 But it looks like the Trump administration just wants to profit off the land,
00:38:27 --> 00:38:28 and that's inexplicable.
00:38:29 --> 00:38:34 Yeah. I don't know if you saw the plans that were introduced over in Davos,
00:38:34 --> 00:38:41 the conceptualizations of Gaza, the new Gaza city, whatever,
00:38:41 --> 00:38:44 with the high rises and all that stuff.
00:38:44 --> 00:38:48 You know, it was like, you know, one reporter dared ask the question,
00:38:48 --> 00:38:50 where are the Palestinians in this conversation?
00:38:51 --> 00:38:55 And nobody, nobody went to the mic to address that. So that's,
00:38:55 --> 00:38:58 that's really, really a crazy thing that's happening there.
00:38:58 --> 00:39:04 All right. Taking into account that less than 1% of your population in the state is African-American.
00:39:05 --> 00:39:11 What is your position on reparations? Yeah, I think we're actually in the top
00:39:11 --> 00:39:13 five whitest states in the nation.
00:39:13 --> 00:39:18 And I think that's actually caused a lot of issues and allowed a lot of the
00:39:18 --> 00:39:19 rhetoric and the misinformation,
00:39:19 --> 00:39:24 the manipulation of information, to take hold in communities that don't have firsthand
00:39:24 --> 00:39:29 experience with the Black community and with people of color.
00:39:30 --> 00:39:39 I think that we have to acknowledge that the last 50 years, systemic oppression has created a huge rift.
00:39:39 --> 00:39:43 I mean, we know this. The data is there. The research is there.
00:39:43 --> 00:39:47 I don't know exactly how reparations should look.
00:39:47 --> 00:39:51 I don't know exactly how the federal government can create programs to try and
00:39:51 --> 00:39:56 address that inequity and that oppression that's happening, but I know that
00:39:56 --> 00:39:59 it does need to be addressed and it does need to happen.
00:39:59 --> 00:40:04 And I think the pendulum swung kind of in the right direction under the Obama
00:40:04 --> 00:40:08 administration, but then it felt like there was such a violent pushback that
00:40:08 --> 00:40:11 we got so much further away from it than we were 10 years ago.
00:40:11 --> 00:40:17 So I would like to see an administration where we have the representation there,
00:40:17 --> 00:40:24 that we have the community there trying to devise programs that do address this
00:40:24 --> 00:40:27 inequality and oppression that has been systemic for so long.
00:40:27 --> 00:40:32 But right now it's devastating and we're not even allowed to address the inequality
00:40:32 --> 00:40:35 in our criminal justice system, which is so black and white,
00:40:36 --> 00:40:39 so obvious when you look at these statistics and these numbers.
00:40:39 --> 00:40:46 So I 100% want to address it and I will support it. I just don't know exactly what that looks like.
00:40:46 --> 00:40:50 And usually I'm such a data nerd. Usually I'll have the policy or the solution.
00:40:50 --> 00:40:53 And that's one issue that I don't think I have the solution yet,
00:40:53 --> 00:40:57 but I want to support the right solution in the end. Yeah.
00:40:58 --> 00:40:59 And I appreciate that. So.
00:41:00 --> 00:41:04 There's a bill that always comes up. I think the latest number was H.R.
00:41:04 --> 00:41:08 40 that asked for a study on the issue.
00:41:09 --> 00:41:12 Would you, so you would support that bill if you got elected?
00:41:13 --> 00:41:16 Oh, a hundred percent. I would go even further.
00:41:16 --> 00:41:20 I would love to create a department that is looking at addressing these issues
00:41:20 --> 00:41:23 with our Black community, even with our tribal communities.
00:41:24 --> 00:41:27 I mean, we have to look at how do we make up the ground.
00:41:27 --> 00:41:33 And I mean, history is how I fell in love with politics and government in the first place.
00:41:33 --> 00:41:39 And so I get incredibly frustrated that those in elected seats aren't familiar with
00:41:39 --> 00:41:43 the history or at the very least are intentionally blind to it.
00:41:43 --> 00:41:47 And so whether it's creating a committee, whether it's creating a department,
00:41:47 --> 00:41:52 I would support any and all efforts to create a study or to find the right solution.
00:41:53 --> 00:41:59 Since the circuits of the U.S. Court of Appeals are assigned a circuit justice from the U.S.
00:41:59 --> 00:42:04 Supreme Court and there are 13 Court of Appeal circuits, would you vote for
00:42:04 --> 00:42:08 an expansion of the U.S. Supreme Court to 13 justices?
00:42:09 --> 00:42:16 I want to say yes to this because we have got to address the issues within the Supreme Court.
00:42:16 --> 00:42:19 The Supreme Court has been compromised.
00:42:19 --> 00:42:23 There are no ifs, ands, or buts about it. We watched this happen.
00:42:23 --> 00:42:27 So we have to address it. The expansion is one way of doing that.
00:42:27 --> 00:42:30 I don't know if there are other ways. I know that other people are bringing
00:42:30 --> 00:42:35 up term limits or trying to cut that down. I don't know if that creates a whole
00:42:35 --> 00:42:40 other issue where we politicize these benches even more than it has already been.
00:42:40 --> 00:42:46 But yes, expansion has been the way that I have been looking at trying to fix it.
00:42:46 --> 00:42:50 So that's where my support is right now. But it is something where I am trying to
00:42:50 --> 00:42:52 look at the potential solutions.
00:42:52 --> 00:42:55 But it's a priority that we have to address immediately.
00:42:55 --> 00:43:02 All right. Idaho has not voted for the Democratic presidential nominee since 1964.
00:43:03 --> 00:43:07 Since 1990, only two Democrats have represented the 1st Congressional District
00:43:07 --> 00:43:09 for a total of six years of service.
00:43:09 --> 00:43:14 The last Democratic woman to represent the 1st District left office in 1963.
00:43:15 --> 00:43:23 Last time you ran in 2022, you received 27% of the vote. Why is 2026 going to be different?
00:43:24 --> 00:43:27 This is where I knew my promise meant something.
00:43:28 --> 00:43:34 I knew this immediately in 2022, because even in 2024, I never once said we could win the election.
00:43:34 --> 00:43:38 I knew that we needed a candidate who was willing to commit the time and look
00:43:38 --> 00:43:43 at the picture and do long-term campaigning to build out the infrastructure,
00:43:43 --> 00:43:48 the trust, the networking, to oppose the kind of dark money that's coming from
00:43:48 --> 00:43:52 places like the Heritage Foundation and the Freedom Foundation and all of these
00:43:52 --> 00:43:56 special interests that profit off of these far-right kind of interests.
00:43:57 --> 00:44:02 And when I started, in most communities I could not find a single progressive
00:44:02 --> 00:44:05 contact. I didn't have any Democratic voters.
00:44:05 --> 00:44:07 There were no local county parties.
00:44:08 --> 00:44:12 In most of these areas, people considered it dangerous to just go in and wave
00:44:12 --> 00:44:14 your hand around saying, I'm a Democrat.
00:44:14 --> 00:44:21 And I think I benefited from the fact that I am a blonde woman who looks like
00:44:21 --> 00:44:23 a conservative Idaho mom.
00:44:23 --> 00:44:27 I think I have also benefited from growing up in these circles.
00:44:27 --> 00:44:31 My family are kind of old school Eisenhower, Reagan Republicans.
00:44:31 --> 00:44:35 My dad was a Trump supporter. I married into a Christian evangelical family.
00:44:35 --> 00:44:40 So I understand what these discussions look like at the dinner table versus
00:44:40 --> 00:44:45 what we understand actually being involved in politics and on the ground.
00:44:45 --> 00:44:50 So I think I was able to try and bridge that gap. And it takes time, right?
00:44:50 --> 00:44:54 I go in and I'm with VFWs and Grange Halls and every local union,
00:44:55 --> 00:44:58 even though 75, 80% of their membership is Republican.
00:44:58 --> 00:45:03 I am going straight into kind of the belly of the beast, as some Democrats might
00:45:03 --> 00:45:06 put it, and showing them that I'm not the enemy.
00:45:06 --> 00:45:11 More importantly, I'm offering them the solutions Republicans refuse to.
00:45:11 --> 00:45:15 Life has gotten much harder in Idaho, and we have had a Republican super majority
00:45:15 --> 00:45:17 for 30 years. You can't blame the Democrats.
00:45:18 --> 00:45:22 But when there hasn't been a Democrat in over 20 years in these communities
00:45:22 --> 00:45:25 to combat that narrative, then we're not going to get anywhere.
00:45:25 --> 00:45:31 So 22 and 24, it was recruiting local candidates, recruiting local contacts,
00:45:32 --> 00:45:38 volunteers, building trust and relationship with local communities and organizations, expanding that.
00:45:38 --> 00:45:41 So it's not just Kaylee for Congress doing work in these communities,
00:45:41 --> 00:45:47 but now we have motivated and connected local community members and given them
00:45:47 --> 00:45:49 the resources necessary to grow and organize.
00:45:50 --> 00:45:55 2026 is a really, really unique opportunity for us. One, because the Republican
00:45:55 --> 00:46:00 Party hasn't delivered on a single promise that they made to the American people.
00:46:00 --> 00:46:06 Wages are still stagnant. We can't afford groceries or gas. Public lands are under threat.
00:46:06 --> 00:46:11 Things have gotten so bad that most Republicans I know are questioning this administration.
00:46:11 --> 00:46:17 But also, I have four years of trust, contact, and volunteers that are so ready
00:46:17 --> 00:46:24 to mobilize. And on top of that, we have reproductive rights on the ballot for the first time in 26.
00:46:25 --> 00:46:30 So you have 75% of the population that are progressive already,
00:46:30 --> 00:46:36 that are women already, that have a really important life-threatening issue
00:46:36 --> 00:46:37 to show up for on Election Day.
00:46:37 --> 00:46:43 And even on top of that, we have a decriminalization of marijuana and medicinal
00:46:43 --> 00:46:47 marijuana on the ballot for the first time, which my veterans,
00:46:47 --> 00:46:51 my affiliated voters, my libertarian voters 100% support.
00:46:51 --> 00:46:53 So we have the momentum.
00:46:53 --> 00:46:59 It is not just Kaylee for Congress out yelling into this giant district trying
00:46:59 --> 00:47:00 to get people to pay attention.
00:47:00 --> 00:47:06 We have built a massive community that is able to do the work to win,
00:47:06 --> 00:47:10 and for the first time I know that we have a path to success.
00:47:11 --> 00:47:16 It'll be a Hail Mary. Don't get me wrong. It's not going to be easy. It's not guaranteed.
00:47:16 --> 00:47:21 But I know exactly where the voters are that we need to win.
00:47:21 --> 00:47:26 I know exactly how we reach them to win. And I know about the trust in the community
00:47:26 --> 00:47:28 to mobilize and engage them to do so.
00:47:28 --> 00:47:35 So this is a really, really special election and opportunity for us to surprise everyone. All right.
00:47:35 --> 00:47:40 So I'm asking this question to all of my guests this year.
00:47:40 --> 00:47:44 Finish this sentence: I have hope because...
00:47:44 --> 00:47:52 No matter how bad it gets, no matter how hard life is for everyday families,
00:47:53 --> 00:47:59 no matter how dire and depressing the situation in the news becomes,
00:47:59 --> 00:48:04 the worse it gets, the more I see our communities come together.
00:48:05 --> 00:48:12 The worse it gets, the more I see everyday people step up and do extraordinary things.
00:48:12 --> 00:48:16 And I think when we are put into these incredibly difficult times,
00:48:17 --> 00:48:20 it's when we really truly get the chance to see the best of us.
00:48:20 --> 00:48:24 And that is what I have seen, especially over the last six months,
00:48:24 --> 00:48:30 is I have seen people that don't have resources or experience step up and do
00:48:30 --> 00:48:33 incredible things and organize incredible solutions.
00:48:33 --> 00:48:40 And they gave me hope and I think they make it really easy to work as hard as
00:48:40 --> 00:48:44 we have to work to try and provide some kind of relief and solutions for the people of my state.
00:48:45 --> 00:48:50 All right. So if people want to get involved with the Kaylee Jade Peterson campaign
00:48:52 --> 00:48:57 to reform and revive Idaho, how can they get involved?
00:48:58 --> 00:49:02 I'm the only Kaylee for Congress that has ever run for Congress in America.
00:49:03 --> 00:49:07 So I'm the only one you'll find online. It's just Kaylee for Congress,
00:49:08 --> 00:49:12 K-A-Y-L-E-E, all spelled out. Dot com is my website.
00:49:12 --> 00:49:17 At gmail.com is my email. And it's Kaylee for Congress on all of my social medias.
00:49:18 --> 00:49:23 We're big on TikTok, Facebook and Instagram. We're incredibly consistent there.
00:49:23 --> 00:49:25 We're getting all of our volunteers together.
00:49:25 --> 00:49:29 And we have volunteers from New York to Texas to Florida to California.
00:49:29 --> 00:49:31 We really are a nationwide team.
00:49:32 --> 00:49:34 We do a lot of remote virtual work together.
00:49:34 --> 00:49:38 But all those pathways go right to me.
00:49:38 --> 00:49:44 So if somebody had a question, if they want me to come do some work in the community,
00:49:44 --> 00:49:46 I am always accessible to them.
00:49:46 --> 00:49:49 And I look forward to hearing from people listening today.
00:49:50 --> 00:49:54 Well, Kaylee Jade Peterson, I am really, really honored that you took the time
00:49:54 --> 00:49:56 out of the campaign to do this.
00:49:56 --> 00:49:58 I wish you much success in the campaign.
00:49:59 --> 00:50:03 One of the rules I have is that, you know, once you've been a guest,
00:50:03 --> 00:50:09 you have an open invitation to come back, and it would be really, really sweet if a U.S.
00:50:09 --> 00:50:13 Congresswoman from Idaho would come back on to be a guest on the program.
00:50:13 --> 00:50:18 So thank you so much for doing this, and again, good luck on the campaign.
00:50:18 --> 00:50:22 Thank you so much for the time. And I look forward to seeing you at the end
00:50:22 --> 00:50:28 of January next year when we've been inaugurated and sworn in.
00:50:28 --> 00:50:31 Thank you so much, Erik. And thank you for the work that you're doing.
00:50:32 --> 00:50:34 All right, guys. And we're going to catch y'all on the other side.
00:50:54 --> 00:50:59 All right, and we are back. And so now it is time for my next guest, David Eliot.
00:51:00 --> 00:51:04 David Eliot is a PhD candidate at the University of Ottawa,
00:51:04 --> 00:51:09 where he researches the social and political effects of artificial intelligence.
00:51:10 --> 00:51:15 He is a member of the Critical Surveillance Studies Lab, and his work on AI
00:51:15 --> 00:51:21 has been recognized with numerous awards, including the 2022 Pierre Elliott
00:51:21 --> 00:51:24 Trudeau Foundation PhD scholarship.
00:51:24 --> 00:51:30 His first book, Artificially Intelligent, The Very Human Story of AI,
00:51:30 --> 00:51:35 was recently published by the University of Toronto Press, and we're going to
00:51:35 --> 00:51:37 be talking about that during the interview.
00:51:37 --> 00:51:42 So, ladies and gentlemen, it is my distinct honor and privilege to have as a
00:51:42 --> 00:51:45 guest on this podcast, David Eliot.
00:51:56 --> 00:52:00 All right. David Eliot, how are you doing, sir? You doing good?
00:52:01 --> 00:52:04 I'm doing great. Thank you so much for having me on today. Well,
00:52:04 --> 00:52:06 I appreciate you coming on.
00:52:06 --> 00:52:12 I just recently had a guest, so I don't know what's going on in Canada with
00:52:12 --> 00:52:16 AI, but I recently, I just had a professor up at the University of Waterloo,
00:52:16 --> 00:52:19 and she had written a book about AI, and I looked and I said,
00:52:19 --> 00:52:21 oh, yeah, I got another guest coming on.
00:52:21 --> 00:52:26 So even though this is a political show, it seems like I'm getting some of the
00:52:26 --> 00:52:29 top minds as far as artificial intelligence goes.
00:52:30 --> 00:52:34 So I really appreciate that. That raises my level, makes people think that I'm intelligent.
00:52:35 --> 00:52:39 Oh, I'm sure you don't need the help with that, but I'm glad I can be of assistance.
00:52:40 --> 00:52:46 Yeah, yeah, yeah, yeah. I appreciate that. So look, I usually start off the
00:52:46 --> 00:52:49 interview with a couple of icebreakers.
00:52:49 --> 00:52:54 So the first icebreaker is a quote that I want you to respond to.
00:52:54 --> 00:53:01 And the quote is, what use is producing knowledge if we cannot effectively share
00:53:01 --> 00:53:03 it with those who need it most?
00:53:04 --> 00:53:07 Yeah. So, I mean, it's a quote from the prologue of the book.
00:53:08 --> 00:53:12 And it was one of the major reasons I wrote this book. It was one of the first lines I actually wrote.
00:53:13 --> 00:53:17 And it came from a frustration I have of academia. And I am an academic.
00:53:17 --> 00:53:21 My grandparents were academics. I was born to this world and I love it.
00:53:22 --> 00:53:25 But I'm also critical of it because I feel like a lot of the time we end up
00:53:25 --> 00:53:30 in these ivory towers producing this knowledge that we don't effectively share
00:53:30 --> 00:53:33 with people. It just becomes a bit of an echo chamber.
00:53:34 --> 00:53:38 And at times we can just be justifying our own existence. That's not everybody,
00:53:38 --> 00:53:41 but it's become more and more common, I think, with academics.
00:53:41 --> 00:53:47 And I see people in the everyday world experiencing problems that in the academy,
00:53:47 --> 00:53:52 we feel like we have the answers for, and then they get upset that people aren't doing things.
00:53:53 --> 00:53:58 And I feel like, well, we aren't reaching out. We aren't producing work in ways that are accessible.
00:53:58 --> 00:54:01 You know, we write peer-reviewed papers. We write journal articles.
00:54:01 --> 00:54:04 We respond to each other, but we just build this echo chamber.
00:54:05 --> 00:54:09 And what I really wanted to do in this book, which I feel like a lot of great authors
00:54:09 --> 00:54:13 have done, this is not unique to me, was to really kind of try and take the
00:54:13 --> 00:54:17 academic research and present it in ways that it gets to the communities who need it.
00:54:17 --> 00:54:20 And that we meet people where they are. We don't say, oh, well,
00:54:20 --> 00:54:22 you should come to us for our knowledge.
00:54:22 --> 00:54:26 That we should meet people where they are, make it accessible to them.
00:54:26 --> 00:54:32 And beyond everything else, make it enjoyable. because people want to enjoy
00:54:32 --> 00:54:36 learning. They want to enjoy the content they're getting. And not everybody's an academic.
00:54:36 --> 00:54:39 Not everybody was meant to be in this life.
00:54:40 --> 00:54:46 Yeah, that's true about being an academic. So the next icebreaker is what we call
00:54:46 --> 00:54:51 20 questions. So I need you to give me a number between 1 and 20.
00:54:51 --> 00:55:00 Go 18. Okay. What's one thing we might all agree is important no matter our differences?
00:55:01 --> 00:55:02 Ooh, that's a good one.
00:55:04 --> 00:55:06 It's one of those things where I feel like there's a lot we agree on that's
00:55:06 --> 00:55:10 important. And when you get asked the question, I kind of blank on it.
00:55:10 --> 00:55:16 I just think central values of humanity in general and trying to be kind to each other.
00:55:17 --> 00:55:19 And I think in practice, what that
00:55:19 --> 00:55:23 takes and promoting human thriving is something I think we all agree on.
00:55:24 --> 00:55:27 I think where we differ is in practice of how we achieve that.
00:55:27 --> 00:55:31 And that's where a lot of division can arise. And I think actually with this,
00:55:32 --> 00:55:36 a great quote from an indigenous leader in Canada I got to spend some time with.
00:55:37 --> 00:55:40 And she's a very famous environmentalist, has done amazing work,
00:55:40 --> 00:55:46 but is constantly speaking to Fortune 500 companies, to oil companies, to all these groups.
00:55:46 --> 00:55:51 And I asked her, how do you work with them when they seem to be so diametrically opposed to you?
00:55:51 --> 00:55:55 And she said, I always need to remember that every human being at their core
00:55:55 --> 00:56:00 has 98% in common. Our core values and what drives us is usually the same.
00:56:01 --> 00:56:06 That 2% of how we try and actualize it and how we understand how we do that is where we differ.
00:56:06 --> 00:56:10 But if we focus on that 2%, we'll get nowhere. If we can try and build from
00:56:10 --> 00:56:13 that 98%, we can build something together.
00:56:13 --> 00:56:17 And that's what sat with me. So I think what we share tends to be core values.
00:56:18 --> 00:56:24 You know, it's interesting because I always made that argument when I was in the legislature.
00:56:24 --> 00:56:31 I used to tell people that about 98% of the time we all agreed on stuff; it's
00:56:31 --> 00:56:33 just the 2% that made the news.
00:56:34 --> 00:56:39 And it looked like we hated each other's guts and all that. So that's interesting.
00:56:40 --> 00:56:44 That's a good philosophy to maintain.
00:56:44 --> 00:56:50 Maybe that'll help us here in the United States kind of navigate things a little better.
00:56:50 --> 00:56:56 How does one evolve from a magician to a researcher of artificial intelligence?
00:56:57 --> 00:57:01 Yeah, it was a very interesting and weird path, because I was working in that
00:57:01 --> 00:57:07 industry, having some nice success, touring, and I got a little tired.
00:57:07 --> 00:57:10 The entertainment lifestyle is exhausting, and I wanted a bit more stability.
00:57:11 --> 00:57:16 So I decided to go to university during the winter months and tour during the summer months.
00:57:16 --> 00:57:20 So I was doing my degree for eight months of the year, tour for four months,
00:57:20 --> 00:57:23 and I ended up studying sociology, which I loved.
00:57:24 --> 00:57:26 Thought I was going to continue that research.
00:57:26 --> 00:57:28 And actually, my original research was on American politics.
00:57:29 --> 00:57:33 I was doing the sociology of American politics at the time, specifically as
00:57:33 --> 00:57:36 it related to the rise of Donald Trump.
00:57:37 --> 00:57:41 And in that, we focused a lot on misinformation. So how does misinformation
00:57:41 --> 00:57:44 get produced? How does it get spread? Why is it so difficult to deal with?
00:57:44 --> 00:57:48 And one of the things with it is how easy it is to produce. It's much harder
00:57:48 --> 00:57:51 to produce good journalism than it is just to write, you know,
00:57:51 --> 00:57:52 a falsehood and put it out there.
00:57:53 --> 00:57:57 And then I came across a program which said it could write like a human,
00:57:57 --> 00:58:01 that you could just give it a prompt and it would write like a human. This was 2019.
00:58:02 --> 00:58:06 I'm like, well, if that was real, that could be really dangerous for misinformation.
00:58:07 --> 00:58:11 So I started researching this, met with some friends from Silicon Valley,
00:58:11 --> 00:58:14 and they're all like, it's real. It's really crazy. You need to see it.
00:58:14 --> 00:58:18 They directed me towards the company. I got to see some early demos of it.
00:58:18 --> 00:58:22 And I was floored. And I instantly said, this is going to change everything.
00:58:22 --> 00:58:24 And originally I was thinking about misinformation.
00:58:25 --> 00:58:28 But quickly I realized, no, this is going to change everything.
00:58:29 --> 00:58:33 And that program was GPT-2. So the company that I was looking at was OpenAI.
00:58:34 --> 00:58:38 And I just had the same moment that everybody else had when they saw ChatGPT.
00:58:38 --> 00:58:42 And when this is going to change everything, and every academic started doing
00:58:42 --> 00:58:46 AI research at that point, I just got lucky that I got to have that moment three
00:58:46 --> 00:58:50 years earlier, or four years earlier. Yeah, four years earlier.
00:58:51 --> 00:58:54 And then the pandemic hit, I couldn't tour. So I said, well,
00:58:54 --> 00:58:58 I'll just continue on this academic path, researching AI. At that point,
00:58:59 --> 00:59:01 I'm like, I feel like we've got at least 10 or 15 years before this becomes
00:59:01 --> 00:59:04 a really big deal. Boy, was I wrong on the timeline.
00:59:05 --> 00:59:09 And things just transitioned in. I never left this job when the world opened up.
00:59:09 --> 00:59:14 And I'm so happy to be in this field and getting to do work and try and help
00:59:14 --> 00:59:15 people understand this moment we're in.
00:59:17 --> 00:59:21 Why is artificial intelligence a very human story?
00:59:23 --> 00:59:29 So I think it's interesting because we tend to talk about AI as if it's some alien technology.
00:59:29 --> 00:59:34 I actually think in Noah Harari's recent book, he called it like an alien multiple
00:59:34 --> 00:59:38 times, this idea of framing it as this like extraterrestrial thing that's come
00:59:38 --> 00:59:41 in that's so different, that's this other.
00:59:41 --> 00:59:44 But the reality is that AI is a
00:59:44 --> 00:59:47 human technology. It's a technology that was made by us,
00:59:47 --> 00:59:51 and it's controlled by us.
00:59:51 --> 00:59:54 In the story of this book, I try and explore
00:59:54 --> 00:59:56 the human foundations of AI. So in this, we're
00:59:56 --> 00:59:59 looking at the humans who made it, how they built it, the decisions they
00:59:59 --> 01:00:02 made. And I think it teaches us some interesting things.
01:00:02 --> 01:00:05 And one is how the behaviors of AI really take
01:00:05 --> 01:00:09 on our behaviors, how the grief of
01:00:09 --> 01:00:12 some of its creators, how their objectives really shaped this
01:00:12 --> 01:00:16 technology we're dealing with right now. That AI, it's
01:00:16 --> 01:00:18 directed by us. It might feel separate, but if
01:00:18 --> 01:00:23 we treat it like that, we treat it as something that we need to control, as something
01:00:23 --> 01:00:27 that, you know, is an existential threat, instead of realizing that it is oftentimes
01:00:27 --> 01:00:32 a reflection of us, a reflection of our decisions. That's why AI in the United
01:00:32 --> 01:00:37 States is so different than AI in Europe, and why AI in Europe is so different than AI in China.
01:00:38 --> 01:00:42 It builds and reflects the people building it, the cultures building it,
01:00:42 --> 01:00:43 the understandings and the values.
01:00:44 --> 01:00:47 And I think one of the really empowering things about that that I wanted readers
01:00:47 --> 01:00:50 to take out of this book is understanding that the shape it's taking,
01:00:51 --> 01:00:54 the way it's affecting us, is caused by human decisions.
01:00:54 --> 01:00:58 And that there are still decisions left to be made. We as humans get to make
01:00:58 --> 01:01:01 more decisions now that will decide the future of AI,
01:01:01 --> 01:01:05 that will decide how it's implemented into our society, how it's built, how it's
01:01:05 --> 01:01:11 designed. So we get to make choices right now that will define the AI world that
01:01:11 --> 01:01:15 we get to live in, that our grandchildren get to live in, and potentially people
01:01:15 --> 01:01:19 for the next 100, 200 years of civilization get to live in.
01:01:20 --> 01:01:25 Yeah, you know, it was really the way you started the book out.
01:01:25 --> 01:01:32 That was really fascinating because, you know, there were some terms
01:01:32 --> 01:01:35 like Boolean. I had heard that term before.
01:01:35 --> 01:01:38 I guess I was paying attention to math class when that happened,
01:01:39 --> 01:01:44 but it was like, so to hear the origin story behind that and how we got the
01:01:44 --> 01:01:46 word algorithm and all that stuff.
01:01:46 --> 01:01:52 It was like you could tell from the jump how you were reminding us of
01:01:52 --> 01:01:57 how much of an impact humans have had in this.
01:01:57 --> 01:02:03 Because, you know, even though in the back of our mind,
01:02:03 --> 01:02:06 there's a very basic understanding that's like,
01:02:07 --> 01:02:15 yeah, we humans created AI, it's another thing to make a real
01:02:15 --> 01:02:19 historical connection all the way back to the beginning of mathematics.
01:02:19 --> 01:02:23 So I really appreciated how you did that. Well, thanks.
01:02:23 --> 01:02:28 Did you? So when you were giving your answer on the quote,
01:02:28 --> 01:02:36 you were talking about why you felt compelled to share knowledge,
01:02:36 --> 01:02:40 how to take the knowledge from the ivory tower and bring it to the masses.
01:02:40 --> 01:02:43 Was that the same compulsion that led you to write the book?
01:02:45 --> 01:02:51 Yeah, yeah, 100%. Because I really felt like with AI, especially at that time,
01:02:51 --> 01:02:54 when I started the book, you know, we are really starting to kind of develop
01:02:54 --> 01:02:56 the public conversation on AI.
01:02:57 --> 01:03:00 And what I saw was in academia, there was a lot of really good research,
01:03:01 --> 01:03:02 a lot of really good stories.
01:03:02 --> 01:03:05 You know, you can look at the reference list in this, I'm trying to pull from,
01:03:05 --> 01:03:07 you know, this academic body of knowledge.
01:03:08 --> 01:03:10 But then I found with my friends, whenever they would ask me,
01:03:11 --> 01:03:12 you know, what book should I read?
01:03:12 --> 01:03:15 What do I need to do to like learn how to navigate AI in my work?
01:03:15 --> 01:03:18 And you know, my friends come from every background.
01:03:18 --> 01:03:23 You know, I'm friends with marketing researchers, I'm friends with athletes,
01:03:23 --> 01:03:27 I'm friends with people from, you know, unfortunately, I'm friends with some politicians too.
01:03:27 --> 01:03:30 And they will always ask, you know, what do I read? And I'd give them these
01:03:30 --> 01:03:32 readings that I thought were valuable.
01:03:32 --> 01:03:35 And they would never read them. And then I would be like, well,
01:03:35 --> 01:03:36 what are you watching? What are you learning from?
01:03:37 --> 01:03:41 And what I came to realize through the way they talked about it or the things
01:03:41 --> 01:03:45 I saw was that what was accessible tended to be,
01:03:45 --> 01:03:50 you know, these crypto bros turned AI bros who were spewing what was really
01:03:50 --> 01:03:53 like not great advice, in my opinion.
01:03:53 --> 01:03:58 So you kind of saw this really weird thing where I felt like this good knowledge
01:03:58 --> 01:04:00 wasn't available to these people.
01:04:00 --> 01:04:03 So I really wanted to kind of fill that gap. I want to provide,
01:04:03 --> 01:04:07 I'm like, if this book doesn't exist, I want to provide it because right now
01:04:07 --> 01:04:11 it seems like the option is, you know, easily accessible and digestible,
01:04:11 --> 01:04:15 but not good information or very good, but not easily accessible information.
01:04:15 --> 01:04:20 So I kind of felt like I was craving that book so I could suggest it to my friends.
01:04:20 --> 01:04:21 And I'm like, you know what?
01:04:21 --> 01:04:25 If there is an opening here, someone's going to make it. Might as well be me.
01:04:26 --> 01:04:32 Yeah. You stated that it wasn't necessary for you to know how AI works,
01:04:32 --> 01:04:35 but to study its implications.
01:04:35 --> 01:04:37 Why, why that approach?
01:04:38 --> 01:04:42 Yeah. So I think a major thing with this is you, of course, need to know how
01:04:42 --> 01:04:44 it works to a certain extent.
01:04:44 --> 01:04:48 But I think one of the really intimidating things with AI is the mathematics
01:04:48 --> 01:04:54 behind it and understanding how vectors work, understanding how neural networks work.
01:04:54 --> 01:04:58 And that nitty gritty is very important when you're trying to understand specific
01:04:58 --> 01:05:01 implications and the specific way it evolves.
01:05:02 --> 01:05:05 But in general, I think you only need
01:05:05 --> 01:05:07 to know a certain amount. Like, there's a base level of knowledge you
01:05:07 --> 01:05:10 need to join the conversation and to be
01:05:10 --> 01:05:14 involved in it, and to be able to, you know, identify when someone's
01:05:14 --> 01:05:18 selling you a lie, to be able to identify, you know, who might actually know what
01:05:18 --> 01:05:22 they're talking about in this area, to be able to look at a scenario, look at
01:05:22 --> 01:05:27 a situation, see a news story and say, hey, that's a problem because of this, the
01:05:27 --> 01:05:30 way they're using AI here is a problem because of these principles.
01:05:31 --> 01:05:34 So the book tries to teach you how AI works.
01:05:34 --> 01:05:38 Like we talk about the actual structures of it. We talk about how these structures
01:05:38 --> 01:05:41 cause issues, how they interact with social processes.
01:05:42 --> 01:05:48 But the specifics of the mathematics, those aren't as important for the general
01:05:48 --> 01:05:49 knowledge understanding.
01:05:49 --> 01:05:52 They aren't as important for your everyday person and what I feel like they
01:05:52 --> 01:05:55 need to effectively participate in democracy.
01:05:56 --> 01:06:01 Yeah. Yeah. Well, I'm glad you took that approach because I hated math.
01:06:02 --> 01:06:06 And my dad was a math major, and he would get frustrated. It's like,
01:06:07 --> 01:06:08 why are you not getting this?
01:06:08 --> 01:06:10 It's like, because I don't like it. He said, well, you got to like it because
01:06:10 --> 01:06:12 you got to pass, all that stuff.
01:06:12 --> 01:06:21 But he would have got real heavy into the math part of the whole thing, but not me, not so much.
01:06:22 --> 01:06:26 I think there's an actual moment in the book, too, where I am talking about something
01:06:26 --> 01:06:30 and I say, don't worry, we're not going to go into the math on this. And then I
01:06:30 --> 01:06:35 felt like I also had to add, in parentheses, possibly because I don't understand it either.
01:06:37 --> 01:06:45 All right. So how dangerous is systemic bias in facial recognition?
01:06:46 --> 01:06:50 Massively dangerous. It is one of the biggest issues with AI.
01:06:51 --> 01:06:55 And it's been there from the beginning. And even as it gets better,
01:06:55 --> 01:06:58 we still have these issues and they continue to come up.
01:06:58 --> 01:07:02 So for your listeners who might not understand how the systemic bias works,
01:07:02 --> 01:07:05 you know, AI is trained on a data set. So when
01:07:05 --> 01:07:08 you're doing that data set, there's kind of two ways, in a sense, this
01:07:08 --> 01:07:11 can happen. One is just, like, a data set made for, say,
01:07:11 --> 01:07:14 recognizing faces. If that data set,
01:07:14 --> 01:07:17 like the original data sets, which were done in Silicon
01:07:17 --> 01:07:21 Valley, is heavily, you know, say, white people, it
01:07:21 --> 01:07:24 will have a difficult time identifying people of different races.
01:07:24 --> 01:07:27 This manifests in
01:07:27 --> 01:07:30 many different ways. Because you have one thing if, like, the system
01:07:30 --> 01:07:32 might not be that good at distinguishing people from one group or another,
01:07:32 --> 01:07:35 so it can be shown to be, you know, very successful in
01:07:35 --> 01:07:38 one place, but then it confuses two people together. So in
01:07:38 --> 01:07:44 stuff like criminal justice applications, that has massive issues. We also see
01:07:44 --> 01:07:48 it where people tend to think about facial recognition in the sense of, you know,
01:07:48 --> 01:07:52 we're going to take one person, take their photo, say, who is this, and we'll find
01:07:52 --> 01:07:55 who it is. So misidentification is a huge problem there.
01:07:56 --> 01:08:01 But facial recognition extends beyond that. There's ideas of using facial recognition
01:08:01 --> 01:08:05 to, you know, detect someone's mood, to detect, you know, to say,
01:08:05 --> 01:08:08 this person's being aggressive or things like that.
01:08:08 --> 01:08:14 Again, you have massive bias implication there because these are not objective
01:08:14 --> 01:08:16 ideas. These are social ideas.
01:08:16 --> 01:08:21 So if you've trained, say, the system in one region, where the way that someone
01:08:21 --> 01:08:24 shows, you know, aggressiveness on their face has a very, you know,
01:08:24 --> 01:08:28 pronounced kind of expression, and that's how it is in one culture,
01:08:28 --> 01:08:30 that might not be the same in another.
01:08:30 --> 01:08:34 So it could easily misread a different expression.
01:08:34 --> 01:08:38 So a system that's meant to recognize if someone's being aggressive, that can be a problem.
01:08:38 --> 01:08:42 I was at a company that was showing that they actually have facial recognition
01:08:42 --> 01:08:46 systems on their computer now, that when dealing with sensitive documents,
01:08:46 --> 01:08:51 it is looking to see if you seem upset or agitated, and it will actually restrict
01:08:51 --> 01:08:55 your access to sensitive documents if you're upset or agitated out of a fear
01:08:55 --> 01:08:59 that you're going to be stealing them or taking some sort of retribution against the company.
01:09:00 --> 01:09:03 And just in a cultural sense,
01:09:04 --> 01:09:08 it doesn't work like that. Like, we don't have that ability to adequately do
01:09:08 --> 01:09:09 that because of the bias in the data.
01:09:10 --> 01:09:15 So yeah, there are huge problems, especially when it comes to policing in the United States.
01:09:15 --> 01:09:18 We see a lot of it used in policing systems there.
01:09:19 --> 01:09:23 It's being used way more in England right now. I have a friend doing a lot of
01:09:23 --> 01:09:25 research on facial recognition systems in England.
01:09:26 --> 01:09:32 It's just a whole can of worms that has an entire subset of academia looking at it.
01:09:33 --> 01:09:35 And most of the best researchers that I know all kind of say,
01:09:35 --> 01:09:40 we should not be using this for anything important because it's just going to
01:09:40 --> 01:09:42 cause more problems than it's worth.
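The dataset-skew problem described above can be made concrete with a toy simulation. Everything here is invented for illustration (the numbers, the one-dimensional "face features", the noise model are not from any real recognition system): the idea is just that a group with less training data gets blurrier embeddings, so its members are confused with each other more often.

```python
import random

random.seed(0)

def misid_rate(n_train, n_people=50, trials=200):
    # Toy assumption: embedding noise shrinks as a group's share of the
    # training data grows; less training data -> blurrier embeddings.
    noise = 1.0 / (n_train ** 0.5)
    # A gallery of "true" face features, one number per enrolled person.
    gallery = [random.uniform(0, 10) for _ in range(n_people)]
    errors = 0
    for _ in range(trials):
        person = random.randrange(n_people)
        # A new photo of that person: true feature plus embedding noise.
        probe = gallery[person] + random.gauss(0, noise)
        # Nearest-neighbor match against the gallery.
        match = min(range(n_people), key=lambda p: abs(gallery[p] - probe))
        errors += (match != person)
    return errors / trials

# Hypothetical split: one group dominates the training set, one barely appears.
rate_well_represented = misid_rate(n_train=9000)
rate_underrepresented = misid_rate(n_train=100)
print(f"well represented:  {rate_well_represented:.1%} misidentified")
print(f"underrepresented:  {rate_underrepresented:.1%} misidentified")
```

The system can look "very successful in one place" (the well-represented group's error rate is tiny) while routinely confusing two people together in the group the training data skipped.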
01:09:43 --> 01:09:47 Yeah, I, you know, I know about the policing thing.
01:09:47 --> 01:09:52 That's if you watch any cop show that's based out of Britain,
01:09:52 --> 01:09:57 every cop, it doesn't matter if they're a normal beat cop or they're the head of Scotland Yard.
01:09:57 --> 01:10:01 Everybody's like, check the facial recognition. You know what I'm saying?
01:10:01 --> 01:10:06 And so it's like, I know it's a big deal in London, and it's something
01:10:06 --> 01:10:08 that's being incorporated in the United States.
01:10:09 --> 01:10:13 And the young lady I was talking about, Kem-Laurin Lubin, she'd written a book
01:10:13 --> 01:10:14 called Design Heuristics.
01:10:15 --> 01:10:21 And she talks about, you know, tries to talk about how we can get to a better
01:10:21 --> 01:10:25 way of dealing with those kind of biases and stuff.
01:10:26 --> 01:10:29 And there's a lot of work being done on it because it's important.
01:10:29 --> 01:10:32 But, like, one example I like to throw out on these biases,
01:10:32 --> 01:10:36 too, which is interesting, was, I believe it was California tried to use
01:10:36 --> 01:10:41 an AI system to do sentencing, or I think it might have been bail.
01:10:41 --> 01:10:45 It was either bail or sentencing, one of the two. And the idea was they were
01:10:45 --> 01:10:46 trying to use it to be less biased.
01:10:47 --> 01:10:50 They were like, instead of having a human, which is going to use their judgment,
01:10:50 --> 01:10:54 we're going to have an AI system do it that will be objective that, you know, won't judge.
01:10:55 --> 01:10:58 And what they found when they audited it was that the system was being racist.
01:10:58 --> 01:11:02 It was, you know, having harsher penalties against minorities.
01:11:02 --> 01:11:07 And it comes from the principle of garbage data in, garbage outputs out,
01:11:07 --> 01:11:11 because the data that these systems were learning from, again,
01:11:11 --> 01:11:15 this is why we say AI is very human, is data that we chose to collect.
01:11:15 --> 01:11:18 It's data that reflects our actions. So it's reflecting us.
01:11:19 --> 01:11:23 So if we have racist data, we're going to have racist outcomes.
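The "garbage data in, garbage outputs out" mechanism can be sketched in a few lines. This is a deliberately hypothetical example (the neighborhoods, counts, and decision rates are all made up, and this is not how any real risk tool is built): a model that simply learns historical decision rates reproduces the skew baked into those decisions, even though it never sees a protected attribute directly.

```python
# Past human decisions, skewed by where enforcement was concentrated.
historical_cases = (
    [("north", True)] * 70 + [("north", False)] * 30 +   # over-policed area
    [("south", True)] * 20 + [("south", False)] * 80     # under-policed area
)

def train(cases):
    """Learn P(denied | neighborhood) by counting -- all this model can do."""
    rates = {}
    for hood in {c[0] for c in cases}:
        denials = [denied for h, denied in cases if h == hood]
        rates[hood] = sum(denials) / len(denials)
    return rates

model = train(historical_cases)
# Identical defendants, different neighborhoods -> very different "risk":
print(model["north"])  # 0.7
print(model["south"])  # 0.2
```

The model is "objective" only in the sense that it faithfully mirrors the biased record it was handed; auditing the outputs, as the speaker describes, is how that mirroring gets caught.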
01:11:23 --> 01:11:28 Yeah, so let me ask this question. So how is data the new oil?
01:11:30 --> 01:11:34 It's an interesting one because on one hand, I actually reject that idea.
01:11:35 --> 01:11:37 But the other hand, it is kind of true.
01:11:37 --> 01:11:43 So data is the new oil, as people like to say, because it's incredibly valuable right now.
01:11:43 --> 01:11:47 It is the raw resource that allows AI systems to work.
01:11:47 --> 01:11:50 And it's it's interesting because it both fuels the
01:11:50 --> 01:11:53 creation of ai system and acts as the gasoline that
01:11:53 --> 01:11:56 powers them so when you're training an ai system like
01:11:56 --> 01:11:59 we're talking about there you need the data for it to make its
01:11:59 --> 01:12:02 inferences from to develop its algorithm but then
01:12:02 --> 01:12:07 for it to actually act it also needs data coming in so real-time data processing
01:12:07 --> 01:12:11 it needs you know to be observing something so even like no facial recognition
01:12:11 --> 01:12:16 the data is the video camera is producing this data which is being interpreted
01:12:16 --> 01:12:20 by an AI system designed and built from data.
01:12:20 --> 01:12:25 So data is just so important. Having a rich data set to train from is incredibly
01:12:25 --> 01:12:27 important and incredibly valuable.
01:12:27 --> 01:12:32 It's why Google is such a valuable company and is actually far more valuable
01:12:32 --> 01:12:36 than they show in their balance sheets because they have more data than anybody else in the world.
01:12:36 --> 01:12:40 And that doesn't come up in their balance sheets anywhere. There's nowhere on
01:12:40 --> 01:12:42 their financial statements that list the value for that data,
01:12:42 --> 01:12:47 because we just don't even know how to value it. We just know it's worth a lot.
01:12:48 --> 01:12:52 The reason I say it's not the new oil, though, is because data and oil are completely
01:12:52 --> 01:12:54 different in how they act as resources.
01:12:55 --> 01:12:58 When we talk about oil, oil is a rival good.
01:12:59 --> 01:13:03 If I have a barrel of oil, only I can burn it. You can't also use it.
01:13:04 --> 01:13:07 You know, if you want to use it, you have to buy it from me. Only one of us can use it.
01:13:07 --> 01:13:12 But if I have a hard drive full of data, and we both want to train AIs,
01:13:12 --> 01:13:17 I can train my AI on the data and hand it over to you and you can train your AI.
01:13:17 --> 01:13:19 It's not diminished by that. So I kind
01:13:19 --> 01:13:23 of reject this data as the new oil, because economically it
01:13:23 --> 01:13:27 works completely differently. It's a fascinating
01:13:27 --> 01:13:32 situation where, when we talk about it as oil, we make this mistake of kind of this
01:13:32 --> 01:13:36 idea of why we created economies, like capitalist economies: you know, we have
01:13:36 --> 01:13:40 limited resources, we need to figure out how to distribute them, what's the best
01:13:40 --> 01:13:44 way to do it, so, you know, we distribute it that way. But, like, with something like
01:13:44 --> 01:13:47 data, there's questions about, well,
01:13:47 --> 01:13:50 everybody can use it. This could just be a free good.
01:13:50 --> 01:13:53 It's completely different. And that's something governments are really struggling
01:13:53 --> 01:13:58 with right now in understanding how we should build data economies.
01:13:58 --> 01:14:04 So since you mentioned Google, you state that Google search is a form of surveillance.
01:14:04 --> 01:14:08 Why did you feel it was important for your readers to know that?
01:14:08 --> 01:14:12 Because everybody jokes about that, right? They'll say, oh, well, you know.
01:14:13 --> 01:14:16 You know, if you put too much information in there, you know,
01:14:16 --> 01:14:18 they, you know, they're spying on you and stuff.
01:14:18 --> 01:14:23 And then, of course, you know, we'll see whatever we research or whatever.
01:14:24 --> 01:14:28 Then it's like all of a sudden we start getting ads. So kind of explain how
01:14:28 --> 01:14:30 Google search is a form of surveillance.
01:14:31 --> 01:14:36 So it's not actually even just Google search. It's the entire suite of the Google infrastructure.
01:14:37 --> 01:14:40 So we call Google a surveillance advertising company.
01:14:41 --> 01:14:44 And you might think that your ads come purely from your Google search.
01:14:45 --> 01:14:48 But there's this interesting thing. Go to any website. I'm sure if you have
01:14:48 --> 01:14:50 a website, you might not even realize this.
01:14:50 --> 01:14:54 If you scroll down to the bottom and look at it. I can't remember the exact stat;
01:14:54 --> 01:14:58 it's something like 75% of the websites on the internet run off Google Analytics.
01:14:59 --> 01:15:04 What that means is whenever a user is on your website, Google can see what they're doing.
01:15:04 --> 01:15:08 All of that is data that they can see as well. They can see those actions along
01:15:08 --> 01:15:09 with the searches they're making,
01:15:10 --> 01:15:13 to build a bigger profile on that user.
01:15:14 --> 01:15:17 And then they use that profile to run through, you know, advertising algorithms.
01:15:18 --> 01:15:21 But they're creating this knowledge of who you are, which is,
01:15:21 --> 01:15:25 I think, really important to understand kind of this economic engine that we're
01:15:25 --> 01:15:29 in right now of surveillance, of that they are surveilling you,
01:15:29 --> 01:15:30 they're producing data about you.
01:15:30 --> 01:15:33 And then that is something that they're using to create value,
01:15:33 --> 01:15:37 whether that's advertising or now we're seeing this data used to create AI systems.
01:15:38 --> 01:15:42 So I think it's really important to recognize that what's going on here is surveillance
01:15:42 --> 01:15:44 and to speak about it in that way.
01:15:45 --> 01:15:50 Because once we kind of recognize that reality, we gain a better ability to
01:15:50 --> 01:15:53 kind of regulate it and say, how do we want to regulate this?
01:15:53 --> 01:15:54 How do we feel about this?
01:15:55 --> 01:16:00 And I think it's important to kind of build an understanding of the power of these systems.
01:16:00 --> 01:16:04 And I don't want to say, you know, I might be a bit more radical on the privacy side with this.
01:16:04 --> 01:16:08 Not everybody needs to be. Some people might be more comfortable with this.
01:16:08 --> 01:16:12 I have family members who know all this information, who are very comfortable with it.
01:16:12 --> 01:16:18 But I think we need that knowledge to have these conversations and decide what we as a society want.
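The cross-site profiling mechanism described above can be sketched roughly. This is a simplified illustration, not Google's actual pipeline; the sites, user IDs, and topic labels are all invented. The point is only the shape of it: pageview "pings" from many unrelated sites, keyed to one identifier, fold into a single interest profile.

```python
from collections import Counter

# What an embedded analytics script might report, one event per visit.
events = [
    {"user": "u42", "site": "running-shoes.example", "topic": "fitness"},
    {"user": "u42", "site": "marathon-blog.example",  "topic": "fitness"},
    {"user": "u42", "site": "news.example",           "topic": "politics"},
]

def build_profile(events, user):
    """Fold one user's visits across different sites into one profile."""
    topics = Counter(e["topic"] for e in events if e["user"] == user)
    return {"user": user, "interests": topics.most_common()}

profile = build_profile(events, "u42")
print(profile["interests"][0])  # the user's strongest inferred interest
```

No single site sees the whole picture; the party whose script runs on all of them does, which is why the speaker insists on calling the aggregate activity surveillance rather than just analytics.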
01:16:19 --> 01:16:25 Yeah, because you basically make the argument that the legislation that says, okay,
01:16:26 --> 01:16:32 well, you can opt out of putting in your personal data is not even really scratching
01:16:32 --> 01:16:37 the surface as far as how these companies like, you know,
01:16:38 --> 01:16:43 Google and I guess AWS, how they can get information.
01:16:43 --> 01:16:47 They don't necessarily need your personal data to get the information they need
01:16:47 --> 01:16:50 to cater to or market to you.
01:16:52 --> 01:16:55 Yeah, and that really is an interesting one because there's two sides to this.
01:16:55 --> 01:16:59 One is that in the old economy, which was surveillance advertising,
01:17:00 --> 01:17:03 they did need a certain level of personal data.
01:17:03 --> 01:17:04 And that's why we started regulating
01:17:04 --> 01:17:07 it. Because they needed to know stuff about you to market to you.
01:17:08 --> 01:17:14 In the new AI economy, the valuable data is what was not that valuable in that economy.
01:17:14 --> 01:17:16 The valuable data is just like everything.
01:17:17 --> 01:17:20 They're just trying to sweep up everything. Personal data is interesting, but
01:17:20 --> 01:17:23 it doesn't cover everything. So now they're
01:17:23 --> 01:17:26 kind of trying to segment off and being like, hey, we'll let you pass privacy laws
01:17:26 --> 01:17:30 that focus on personal data so you can feel safe, as long
01:17:30 --> 01:17:34 as you let us have all this other stuff. Like, personal data
01:17:34 --> 01:17:36 just means that they can't identify you from it. It could
01:17:36 --> 01:17:40 still be, like, you know, your heart rate from your smartwatch.
01:17:40 --> 01:17:43 There are countries in the world where that's not considered personal data. You
01:17:43 --> 01:17:46 know, what you're doing on websites. They'll just
01:17:46 --> 01:17:48 cut off, well, David's name isn't attached to
01:17:48 --> 01:17:52 it, so, you know, we can use this information, what
01:17:52 --> 01:17:54 he did. Because they're not looking to market to me, they're looking
01:17:54 --> 01:18:00 to aggregate that into AI systems. An interesting one is how that has moved into
01:18:00 --> 01:18:05 marketing algorithms. A really interesting specific one that I worked on was
01:18:05 --> 01:18:09 a proposal by Google, where they were like, you know what, we're going to stop
01:18:09 --> 01:18:12 collecting all this personal data, we're going to stop doing that. And instead,
01:18:12 --> 01:18:16 they came up with an algorithm that would basically hop into your computer,
01:18:16 --> 01:18:18 I think it was like once a week.
01:18:18 --> 01:18:23 Look at your last five URLs you had visited, and from that, create a profile
01:18:23 --> 01:18:26 of who you were based on just your last five searches.
01:18:26 --> 01:18:29 That was far more accurate at predicting,
01:18:29 --> 01:18:34 like, who you were, what you wanted to see, what your feelings were in that period
01:18:34 --> 01:18:39 of time, than they had ever got from doing, like, this massive personalized version
01:18:39 --> 01:18:43 where they had to collect all this data. Which just shows the power of AI there.
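The "profile from your last few URLs" idea described above can be sketched in miniature. This is a rough analogue only (the URLs, the keyword table, and the scoring are invented, and a real proposal of this kind would use a trained model rather than a keyword lookup): classify a handful of recent URLs into a coarse interest bucket without storing any long-term personal record.

```python
recent_urls = [  # hypothetical last five pages visited
    "https://example.com/reviews/trail-running-shoes",
    "https://example.com/forum/marathon-training",
    "https://example.org/news/election-coverage",
    "https://example.com/store/gps-watches",
    "https://example.com/blog/interval-workouts",
]

KEYWORD_TOPICS = {  # toy keyword -> topic table standing in for a trained model
    "running": "fitness", "marathon": "fitness", "workouts": "fitness",
    "election": "politics", "watches": "gadgets",
}

def classify(urls):
    """Score topics by keyword hits across the recent URLs; return the top one."""
    scores = {}
    for url in urls:
        for keyword, topic in KEYWORD_TOPICS.items():
            if keyword in url:
                scores[topic] = scores.get(topic, 0) + 1
    return max(scores, key=scores.get)

print(classify(recent_urls))  # fitness
```

Even this crude version shows why a small recent window can beat a sprawling history: the last few URLs are a dense, current signal of "what you wanted to see that period of time."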
01:18:43 --> 01:18:48 Will AI create a new Luddite fallacy in our society?
01:18:49 --> 01:18:54 It's an interesting proposition because the Luddite fallacy,
01:18:54 --> 01:18:59 right, is this idea that, well, the fallacy is the idea that the,
01:18:59 --> 01:19:03 or I need to hop back and remember which one specifically the Luddite fallacy is.
01:19:04 --> 01:19:08 Because the Luddites are the people who rejected the technology or are seen
01:19:08 --> 01:19:11 as being the ones who, you know, rejected the technology.
01:19:11 --> 01:19:17 But then, oh, I'm trying to work through the economic fallacy in my head.
01:19:17 --> 01:19:20 Because, yeah, the fallacy is that they rejected new
01:19:20 --> 01:19:24 technologies of the Industrial Revolution and didn't recognize
01:19:24 --> 01:19:27 that those technologies would help them in the
01:19:27 --> 01:19:33 long run, overall, you know, would raise economic standings. It is also based off
01:19:33 --> 01:19:38 a misunderstanding of the Luddites. The Luddites, they didn't reject the
01:19:38 --> 01:19:42 technology; they were actually fine with it. They rejected how it was being integrated
01:19:42 --> 01:19:46 into society, and what they were saying is that this is destroying our communities.
01:19:46 --> 01:19:48 So the Luddites were mainly skilled artisans.
01:19:49 --> 01:19:51 So when they saw the technology come in, they're like, whoa,
01:19:51 --> 01:19:55 we can produce so much better stuff now. We can make much better products for
01:19:55 --> 01:19:56 people and we can mass produce it.
01:19:56 --> 01:20:00 But the factory owner said, no, we're firing all the artisans and we're mass
01:20:00 --> 01:20:03 producing cheap things and we're going to get more profit from that.
01:20:04 --> 01:20:07 That's what the Luddites were upset about. They saw it as this destruction of
01:20:07 --> 01:20:11 agricultural work, the forcing of people into cities, into factories.
01:20:11 --> 01:20:16 So what they were more concerned about was a quality of living and a quality of life in their time.
01:20:17 --> 01:20:20 So I think what we could see is, yeah, that rise again, where,
01:20:21 --> 01:20:23 you know, we see it right now, like with the economy, right,
01:20:24 --> 01:20:27 where that line's going up, the stock market's going up.
01:20:27 --> 01:20:30 And this is kind of like what that fallacy is talking about.
01:20:31 --> 01:20:34 You know, look, everything's going great. Economic performance is awesome.
01:20:34 --> 01:20:38 But that doesn't account for the lived experiences of people within it.
01:20:38 --> 01:20:43 It doesn't account for the fact that the Luddites were experiencing a massive trauma.
01:20:43 --> 01:20:46 They were having their livelihoods and their jobs taken from them.
01:20:46 --> 01:20:48 And that's what they were pushing back against.
01:20:48 --> 01:20:52 They were saying, we need to do this in a humane way. We approve of this technology.
01:20:52 --> 01:20:55 We agree it can create massive economic benefit.
01:20:55 --> 01:20:58 We need to implement it in a way that's humane. And I think,
01:20:58 --> 01:21:03 you know, with AI, we do risk dealing with the same problems there because,
01:21:03 --> 01:21:06 you know, people are like, oh, it's going to be so much better for productivity, all of this.
01:21:06 --> 01:21:10 But you have situations where, you know, a 40-
01:21:10 --> 01:21:12 year-old father who's been in a
01:21:12 --> 01:21:15 field, you know, has his education, might get
01:21:15 --> 01:21:18 laid off, and they'll say, sorry, you're redundant now, AI can
01:21:18 --> 01:21:23 do this. What is that person going to do? You know, they're so far into their career,
01:21:23 --> 01:21:27 retooling is difficult. Do they go back to school? Do they accept a job that's
01:21:27 --> 01:21:31 now much under the salary they had before? And we've seen this happen
01:21:31 --> 01:21:36 with the Rust Belt states in the U.S., right? And we've seen the effect this has
01:21:36 --> 01:21:37 on people when they are de-skilled,
01:21:38 --> 01:21:41 when you have these massive automation capabilities.
01:21:41 --> 01:21:43 And I think that's something we really need to be worried about.
01:21:43 --> 01:21:48 So in a political sense, one of the things I really advocate for is we don't
01:21:48 --> 01:21:51 know how difficult this transition is going to be.
01:21:51 --> 01:21:57 We still don't know how big the AI job transformation is going to be. We know it will exist.
01:21:58 --> 01:22:02 We need to prepare for that worst case scenario. We need to prepare and build
01:22:02 --> 01:22:06 social safety nets to build understandings that like if people are being de-skilled,
01:22:06 --> 01:22:08 you know, in their 40s, how are we going to deal with that?
01:22:08 --> 01:22:11 That'll be different than people who just went to university,
01:22:11 --> 01:22:15 got a business degree, paid $100,000 for it, you know, and have now had basically
01:22:15 --> 01:22:17 all their knowledge wiped out.
01:22:17 --> 01:22:21 What do we do about those people? These are questions we need to be asking to
01:22:21 --> 01:22:26 figure out like what systems could we have that reduce the harm to those people
01:22:26 --> 01:22:27 from that because that harm is real.
01:22:28 --> 01:22:31 That harm has massive social effects, massive psychological effects,
01:22:32 --> 01:22:34 and very long-term effects.
01:22:34 --> 01:22:39 As someone who personally was affected by the 2008 recession with my family,
01:22:39 --> 01:22:43 I can speak to being a kid that lives in one of those families that experiences that.
01:22:43 --> 01:22:49 As we've seen from the Rust Belt states, we can see the political shift it can cause.
01:22:49 --> 01:22:52 We can see the resentment that it can cause.
01:22:52 --> 01:22:57 So I think it's something that we as a society need to be preparing for and
01:22:57 --> 01:23:01 need to be asking difficult questions about, yes, AI is beneficial for society.
01:23:01 --> 01:23:04 There's no stopping the train. It's left the station.
01:23:04 --> 01:23:10 How do we make sure it runs smoothly? Yeah, because Nike, for example,
01:23:11 --> 01:23:17 just made a decision that they were going to lay off like 700-some employees,
01:23:17 --> 01:23:20 and they were saying that AI was going to handle
01:23:21 --> 01:23:27 the stuff that those employees handled. And I guess it's going to be a hybrid, where some
01:23:27 --> 01:23:29 of the mundane things AI can do,
01:23:29 --> 01:23:34 but some of the things that require human touch, they'll stay,
01:23:34 --> 01:23:36 which I'm trying to envision in my mind.
01:23:36 --> 01:23:42 It's like, okay, well, as far as ringing up the items, AI will do that some kind of way.
01:23:43 --> 01:23:49 But as far as the sales pitch and getting you to buy these particular shoes
01:23:49 --> 01:23:53 or whatever, you still need that human creativity to make that happen.
01:23:53 --> 01:23:58 I don't know what their plan is and how that's going to work,
01:23:58 --> 01:24:05 but that just highlights to me what you just addressed there.
01:24:05 --> 01:24:12 A couple more questions. What particular application for AI are you most excited about?
01:24:12 --> 01:24:17 I think I'm really excited about it in accessibility and healthcare.
01:24:17 --> 01:24:21 I think those are two areas. I mean, right now, healthcare really is the big one.
01:24:22 --> 01:24:25 And the major reason I'm really excited about that is not just the advances
01:24:25 --> 01:24:30 that we are making, but also the structure of the healthcare system and why
01:24:30 --> 01:24:33 it's actually a really well-structured space for automation.
01:24:34 --> 01:24:39 Because the advances are amazing in reading radiology scans.
01:24:39 --> 01:24:44 And I was just at a rural health care center in Saskatchewan that's implementing
01:24:44 --> 01:24:49 AI, run out of centralized locations, to help bring accessible health care
01:24:49 --> 01:24:50 to rural communities.
01:24:51 --> 01:24:55 Just amazing technologies here. And what I think is interesting about it is
01:24:55 --> 01:25:00 when we're applying it in these situations, it's about expanding.
01:25:00 --> 01:25:03 It's about expanding the health care offerings we already have,
01:25:03 --> 01:25:07 expanding them to more regions, trying to bring them to people that don't already have them.
01:25:07 --> 01:25:10 It tends to be less about replacing something with a worse product.
01:25:11 --> 01:25:14 So we end up seeing this situation where, you know,
01:25:14 --> 01:25:18 because health care is incredibly heavily regulated,
01:25:18 --> 01:25:24 you see more kind of skepticism, and you see more testing of systems
01:25:24 --> 01:25:28 before they're implemented, because the cost of messing up is so much higher,
01:25:28 --> 01:25:33 and the regulatory cost of messing up is very high. So you see much better systems being used.
01:25:34 --> 01:25:39 On top of that, we don't tend to see true automation, and what I mean in that
01:25:39 --> 01:25:41 sense is, if you are a doctor,
01:25:41 --> 01:25:45 if you're a radiologist, let's say, and they bring in,
01:25:45 --> 01:25:47 you know, an AI technology to read scans for you.
01:25:48 --> 01:25:51 In other fields, as we see, like maybe with Nike, they bring in that technology
01:25:51 --> 01:25:54 and they go, lay off the person.
01:25:55 --> 01:25:59 We are in a doctor shortage. Doctors already are not doing all the work they
01:25:59 --> 01:26:01 could be doing. There's so much more that could be done.
01:26:02 --> 01:26:04 So when you bring in something like this that can read the scans,
01:26:04 --> 01:26:06 you're liberating their time.
01:26:06 --> 01:26:10 You're giving them more time to engage in other activities. So now maybe that
01:26:10 --> 01:26:12 radiologist will treat more patients.
01:26:12 --> 01:26:16 Maybe that radiologist will spend more time on research, helping to design new
01:26:16 --> 01:26:19 cancer detection tools or new cancer treatment tools.
01:26:20 --> 01:26:25 So I think the healthcare implications are massive and I think really exciting
01:26:25 --> 01:26:29 both in the technological developments and in just the structure that the healthcare
01:26:29 --> 01:26:33 system has to be able to implement them in very humane ways.
01:26:34 --> 01:26:37 What do you want the readers to take from this book?
01:27:38 --> 01:27:42 I really want this book to be a starting point for readers. I want you to start
01:27:42 --> 01:27:46 here and not end here. I want it to be a place that introduces you to the things
01:27:46 --> 01:27:49 you need to know, but also that's empowering,
01:27:49 --> 01:27:55 because I think a lot of talk about AI is very doom and gloom. I talked before about, you know,
01:27:55 --> 01:27:59 how I felt like my friends were seeing not-great information. The other side
01:27:59 --> 01:28:04 of that coin was information that was just, you know, doomerism, that was just,
01:28:04 --> 01:28:07 you know, AI is going to take over, AI is going to destroy everything,
01:27:07 --> 01:27:08 it's this all-powerful force.
01:27:09 --> 01:27:11 I want people to come out with a bit of optimism.
01:27:12 --> 01:27:16 Or optimism might be the wrong word. The last chapter is called Hope,
01:27:16 --> 01:27:18 and I think there's a difference between hope and optimism.
01:27:19 --> 01:27:21 And hope is the belief that there's still a good path forward.
01:27:22 --> 01:27:23 Hope is the last thing we lose.
01:27:24 --> 01:27:28 And I hope that people come out with a sense of hope, not only that the future
01:27:28 --> 01:27:32 with AI can be better, but hope that they can make a difference,
01:27:32 --> 01:27:36 and hope that they can understand it, a realization that we get to shape this future.
01:27:37 --> 01:27:40 And it's not one person who gets to make the decisions here.
01:27:40 --> 01:27:43 The future comes from us. The future comes from democracy.
01:27:43 --> 01:27:48 The future comes from us working together and demanding a better future.
01:27:48 --> 01:27:52 So I hope you gain the tools from this book to advocate for yourself.
01:27:52 --> 01:27:57 You gain the tools from this book to be able to speak to your own experiences
01:27:57 --> 01:27:59 and to be able to ask for more.
01:27:59 --> 01:28:03 And you become a part of this great conversation, which is, I think,
01:28:03 --> 01:28:05 the most important conversation of our time.
01:28:05 --> 01:28:12 So it's funny you bring up hope because one of the questions I'm asking every
01:28:12 --> 01:28:16 guest as we close out is to finish this sentence.
01:28:17 --> 01:28:19 I have hope because...
01:28:21 --> 01:28:25 I have hope because there's still many decisions left to be made.
01:28:26 --> 01:28:30 Okay. All right. It's succinct and to the point.
01:28:31 --> 01:28:35 David, how can people get this book, Artificially Intelligent,
01:28:35 --> 01:28:39 A Very Human Story, and how can they reach out to you?
01:28:40 --> 01:28:43 So you can find Artificially Intelligent anywhere books are sold.
01:28:43 --> 01:28:49 You can go to your local bookshop or Barnes & Noble, or order it online through bookshops.org.
01:28:49 --> 01:28:52 It's a great place if you're looking to order it; it supports local bookstores.
01:28:53 --> 01:28:56 Get it through Barnes & Noble, get it through Amazon. You can get it directly
01:28:56 --> 01:28:59 through my publisher, University of Toronto Press.
01:28:59 --> 01:29:03 I think they have a big sale on right now actually to celebrate their birthday.
01:29:04 --> 01:29:05 So you could go check that out.
01:29:05 --> 01:29:08 And you can find me at davideliot.org.
01:29:08 --> 01:29:13 Eliot has one L and one T in it. I will be spelling that out for the rest of my life.
01:29:13 --> 01:29:17 So davideliot.org, one L and one T. I have an email form on there.
01:29:17 --> 01:29:21 If you want to send me an email, get in touch with me. I hope you pick up
01:29:21 --> 01:29:22 the book. I hope you enjoy it.
01:29:23 --> 01:29:26 I hope you join the conversation, and thank you so much for having me on today, Erik.
01:29:27 --> 01:29:31 Well, David Eliot, it was an honor to have you on, and I can relate,
01:29:31 --> 01:29:39 you know, the way I spell Erik seems to be unique to some people, instead of with a C it's with a K,
01:29:39 --> 01:29:44 but, you know, even my AI, when it does the transcript,
01:29:44 --> 01:29:46 I have to go in and edit that all the time.
01:29:47 --> 01:29:51 Oh no. But, but David, I'm, I'm really glad that we had this discussion and
01:29:51 --> 01:29:52 I appreciate you coming on.
01:29:52 --> 01:29:56 Perfect. Thank you very much. All right, guys. And we're going to catch y'all on the other side.
01:30:08 --> 01:30:13 All right. And we are back. So I want to thank Kaylee Jade
01:30:14 --> 01:30:17 Peterson and David Eliot for coming on the show.
01:30:19 --> 01:30:23 You know, I wish Kaylee well. She has, you know,
01:30:24 --> 01:30:30 been really steadfast and has made it a crusade to make sure that progressive
01:30:30 --> 01:30:35 voices are heard and are organized in Idaho.
01:30:35 --> 01:30:38 And I know a lot of people are like, really?
01:30:39 --> 01:30:41 Is that really a worthwhile endeavor?
01:30:42 --> 01:30:48 And yes, it is, especially for her and her family, because she lives there.
01:30:48 --> 01:30:57 And I really think that if she gets in, she's going to be a positive voice for us.
01:30:58 --> 01:31:05 And it's definitely going to be better than what they got up there already.
01:31:05 --> 01:31:07 So I wish her well. You can...
01:31:09 --> 01:31:13 Just Google her, and if you want to support her, go ahead and do that.
01:31:14 --> 01:31:18 And then David Eliot, young man, very insightful young man,
01:31:18 --> 01:31:23 who has written this book, Artificially Intelligent.
01:31:24 --> 01:31:28 And I really learned a lot of stuff from it.
01:31:28 --> 01:31:35 He is a sociologist by training, and he's done a lot of research
01:31:35 --> 01:31:41 from that capacity dealing with artificial intelligence and its impact and all that.
01:31:41 --> 01:31:43 So he's not a computer guy.
01:31:44 --> 01:31:46 But just like Dr.
01:31:46 --> 01:31:57 Lubin, who came on before, he brings a particular perspective, especially from the human side.
01:31:57 --> 01:32:06 But I learned a lot from him as far as like the origin of the word algorithm, you know.
01:32:07 --> 01:32:10 Yeah, I've got to get that book. It's pretty good. And as a matter of fact,
01:32:10 --> 01:32:16 if you heard the interview, remember, I mentioned Dr. Lubin's work.
01:32:16 --> 01:32:18 And David was scribbling that down.
01:32:19 --> 01:32:23 And so, Dr. Lubin, I think you're going to have one more person buying your
01:32:23 --> 01:32:27 book for sure. Yes, I just want to thank them for coming on,
01:32:28 --> 01:32:33 especially during this moment that we are in.
01:32:34 --> 01:32:42 So as I'm recording this, a lot of things have happened in Minnesota over the last week.
01:32:43 --> 01:32:50 You know, this is a weekly show, so with recording and all that,
01:32:51 --> 01:32:57 things happen right around the time we're recording, or I miss them because I'm recording.
01:32:57 --> 01:33:03 But a young man got shot, same age as Renee Good. His name was Alex Pretti.
01:33:03 --> 01:33:07 Alex was an ICU nurse.
01:33:08 --> 01:33:13 And part of my job, when I worked for the Fulton County Sheriff's Office,
01:33:13 --> 01:33:21 was being assigned to Grady Hospital, where I would have to sit on patients that were sent to the ICU.
01:33:22 --> 01:33:25 That is a job that requires a lot of diligence.
01:33:25 --> 01:33:33 It's not as hectic as being down in the emergency room, but it's still a very,
01:33:33 --> 01:33:38 very stressful, very tense situation, because these people made it from the
01:33:38 --> 01:33:40 emergency room to the ICU.
01:33:41 --> 01:33:46 Their life is still in the balance. And so these people have to be very sensitive,
01:33:47 --> 01:33:52 very professional, and very aware of what's going on with the patients they're assigned.
01:33:53 --> 01:33:59 And there's a video that has gone around of Mr.
01:33:59 --> 01:34:07 Pretti giving basically a eulogy to a veteran who died at the VA hospital in
01:34:07 --> 01:34:09 Minnesota where he worked.
01:34:11 --> 01:34:15 You know, that's just kind of a glimpse of what this young man was.
01:34:15 --> 01:34:18 He was, you know, he was an avid outdoorsman.
01:34:19 --> 01:34:22 He liked to hike and bike and all that stuff.
01:34:23 --> 01:34:31 But he was a pretty active protester of ICE because another video has come out.
01:34:31 --> 01:34:33 And again, these people are so dumb.
01:34:34 --> 01:34:38 And I'm kind of like Tiffany Cross now. You know, everybody else is trying to
01:34:38 --> 01:34:45 be polite and, you know, not say direct things,
01:34:45 --> 01:34:49 while she'll get on CNN and just say, you're lying, right?
01:34:50 --> 01:34:53 Instead of saying, well, I don't agree with that or whatever,
01:34:53 --> 01:34:57 no, she just comes out and says it. And I think that's the way you have
01:34:57 --> 01:35:03 to treat these people, because they take advantage of kindness.
01:35:03 --> 01:35:07 Right. They take advantage of civility, because they have none.
01:35:08 --> 01:35:13 And, you know, so they're bulls in a china shop for real, right?
01:35:14 --> 01:35:20 But they're also stupid people, which usually kind of goes hand in hand with brashness.
01:35:20 --> 01:35:25 And so they released a video,
01:35:26 --> 01:35:30 a video showing Mr.
01:35:30 --> 01:35:35 Pretti, like 11 days before he got shot, at another protest.
01:35:36 --> 01:35:42 And obviously there was some exchange between him and one of the ICE or Border Patrol
01:35:42 --> 01:35:46 or federal people that were out there.
01:35:46 --> 01:35:50 And he didn't take too kindly to it.
01:35:50 --> 01:35:56 He walked up to the car. I think fluids were exchanged, like spitting.
01:35:57 --> 01:36:02 Then he kicked the taillight out of the car, right.
01:36:04 --> 01:36:07 And somebody said, wow, he's got to be pretty strong. You know,
01:36:07 --> 01:36:11 cold weather kind of helps; it makes things brittle.
01:36:12 --> 01:36:16 So when he kicked the taillight out of the car, the officers swarmed him and
01:36:16 --> 01:36:19 all that, took him down. But he didn't get arrested.
01:36:20 --> 01:36:23 They just kind of subdued him.
01:36:24 --> 01:36:31 And then when he stood up, you could see that he had a gun holstered.
01:36:32 --> 01:36:36 Didn't take the gun from him. They almost acted like they didn't see it.
01:36:37 --> 01:36:45 And, you know, they got up, they walked away, got in their SUV with a damaged
01:36:45 --> 01:36:46 taillight and drove off.
01:36:47 --> 01:36:51 So obviously it was a different group of people that Mr.
01:36:52 --> 01:37:00 Pretti encountered 11 days later, when he was recording an action that they were
01:37:00 --> 01:37:05 carrying out, according to Secretary Noem, and with her,
01:37:06 --> 01:37:11 you've got to fact-check her breathing, right? Because she lies that much.
01:37:11 --> 01:37:17 She said that they were about to arrest a child pedophile.
01:37:17 --> 01:37:20 And like some comedian said, is there a difference?
01:37:21 --> 01:37:22 Is there any other kind of pedophile?
01:37:24 --> 01:37:28 But it was this person
01:37:28 --> 01:37:33 of interest they were after, somebody that had been accused of being
01:37:33 --> 01:37:39 a pedophile. And so the story that they're telling is that he impeded. Now, initially
01:37:39 --> 01:37:45 they said that he came in, had the gun drawn, was basically doing his best Edward G.
01:37:46 --> 01:37:51 Robinson, Jimmy Cagney imitation, saying, I'm going to take you coppers out, see.
01:37:51 --> 01:37:56 I mean, just to the extreme, right, when it was totally the opposite.
01:37:56 --> 01:38:00 There was a woman who was beside Mr.
01:38:00 --> 01:38:05 Pretti who was also videotaping, and obviously she was saying some things,
01:38:05 --> 01:38:09 and one of the officers got offended and pushed her down in the snow.
01:38:10 --> 01:38:15 Mr. Pretti went to go pick her up. And in the process of picking her up,
01:38:16 --> 01:38:20 another officer came and pepper sprayed her and Mr. Pretti.
01:38:22 --> 01:38:25 And needless to say, he didn't take too kindly to that.
01:38:26 --> 01:38:34 But before he really could do anything, they grabbed him and started beating
01:38:34 --> 01:38:36 him and holding him down.
01:38:36 --> 01:38:42 And then one of the officers saw the gun this time.
01:38:42 --> 01:38:47 One officer saw the gun in his holster and pulled it out.
01:38:49 --> 01:38:56 Now, the audio is terrible, so you can't really hear anything but the gunshots, clearly.
01:38:58 --> 01:39:03 But I'm sure the officer said something to the effect of, I have the gun.
01:39:03 --> 01:39:12 Now, I believe that the officers heard the word gun.
01:39:14 --> 01:39:17 And at first I thought it was...
01:39:19 --> 01:39:23 I thought it was like somebody accidentally might have, you know,
01:39:23 --> 01:39:26 fired off a round in all that commotion and chaos,
01:39:27 --> 01:39:31 and then, you know, more shots were fired. That wasn't the case.
01:39:32 --> 01:39:38 It was like Mr. Pretti got up or he looked like he was about to get up and officers
01:39:38 --> 01:39:41 shot him directly in the back.
01:39:42 --> 01:39:48 And not once, not twice, but four times. And as he was rolling over from that,
01:39:49 --> 01:39:52 he got shot five or six more times.
01:39:52 --> 01:39:56 I'm still waiting on the autopsy to see how many bullets actually hit him,
01:39:56 --> 01:39:58 but at least 10 shots were fired.
01:39:59 --> 01:40:01 So basically they killed him.
01:40:02 --> 01:40:10 And, you know, the scenario I gave would probably be the best scenario they
01:40:10 --> 01:40:11 would have in their defense.
01:40:12 --> 01:40:21 But, you know, even if the person had a weapon, once you have them subdued like
01:40:21 --> 01:40:26 that, you know, if you have their arms, they can't shoot you.
01:40:26 --> 01:40:32 They can't. I have yet to see a human being who is being held down by police,
01:40:33 --> 01:40:37 both of their arms held down, be able to shoot somebody.
01:40:37 --> 01:40:39 I have never seen that happen.
01:40:41 --> 01:40:45 Not in the wildest Ripley's Believe It or Not moment. I have never seen that happen.
01:40:45 --> 01:40:53 And so for that officer to fire the weapon, again, just bad policing.
01:40:54 --> 01:41:00 And the way that he did it, he's going to get a murder charge if the state brings
01:41:00 --> 01:41:02 charges, and they should.
01:41:03 --> 01:41:11 He might get off with manslaughter, but he shot an unarmed man at that point.
01:41:12 --> 01:41:17 A person that was in control, a person that was subdued.
01:41:18 --> 01:41:22 The only thing they should have done at that point was put handcuffs on him,
01:41:22 --> 01:41:23 if that was their intention.
01:41:26 --> 01:41:30 So there's that. And then, you know, we've been trying to keep y'all abreast
01:41:30 --> 01:41:33 of what happened with, you know,
01:41:34 --> 01:41:38 Nekima Levy Armstrong, who's been on the show a couple of times and a couple
01:41:38 --> 01:41:43 of other folks that had organized a protest at a church where the leader of
01:41:43 --> 01:41:47 ICE in the state of Minnesota is a pastor.
01:41:48 --> 01:41:52 And we know that she got arrested, and then they used AI
01:41:53 --> 01:41:57 to make it seem like she was crying when she got arrested, which wasn't the case.
01:41:58 --> 01:42:02 Thank goodness for cameras everywhere, because the camera at her apartment complex
01:42:02 --> 01:42:08 showed somebody that was not crying, and so did the cameras at the courthouse.
01:42:08 --> 01:42:15 At no time was she boo-hooing like they depicted it and put it on the actual White House website, right?
01:42:17 --> 01:42:22 So she got released. All three of them got released.
01:42:23 --> 01:42:28 Now, as I'm recording, two of the reporters that were there,
01:42:28 --> 01:42:36 Georgia Fort, who is an independent journalist, like I said in the intro, out of Minnesota.
01:42:36 --> 01:42:42 She was like a local anchor there and basically started her own news service.
01:42:42 --> 01:42:50 And, of course, she's been covering, you know, any and everything dealing with
01:42:50 --> 01:42:57 the protests and all that, and she knows Nekima.
01:42:57 --> 01:43:07 And so she was at the first press conference Nekima and the coalition had right
01:43:07 --> 01:43:10 after Renee Good got killed.
01:43:12 --> 01:43:14 And, you know, she's just been keeping track.
01:43:14 --> 01:43:19 She's been keeping track ever since ICE showed up in force in Minneapolis.
01:43:19 --> 01:43:25 And so Don Lemon has, you know, he just gets on a plane and he goes wherever he wants to go.
01:43:26 --> 01:43:29 If he's not doing man-in-the-street stuff, he's going where the action is.
01:43:29 --> 01:43:31 So, of course, he was in Minneapolis.
01:43:32 --> 01:43:34 He got wind of this protest.
01:43:34 --> 01:43:38 Well, he got wind that some activity was getting ready to happen.
01:43:38 --> 01:43:44 And he kind of explains that if you follow him. He explained what was going to happen.
01:43:45 --> 01:43:49 He had no idea what they were going to do; they just knew some kind of
01:43:49 --> 01:43:50 action was going to take place.
01:43:51 --> 01:43:57 And so when Nekima and the group went in to protest, he and Georgia and all the other
01:43:59 --> 01:44:04 journalists that were covering it went in with them. And, you know, Don,
01:44:04 --> 01:44:10 being the aggressive person he is, he basically kind of got the pastor off to the
01:44:10 --> 01:44:14 side, who was the guest pastor, I guess, or the assistant pastor.
01:44:14 --> 01:44:16 He wasn't the guy that was over ICE.
01:44:17 --> 01:44:21 But, you know, he started interviewing him. And then he interviewed like two
01:44:21 --> 01:44:26 or three parishioners of the church, you know, afterwards. And,
01:44:26 --> 01:44:28 of course, he interviewed some of the protesters.
01:44:30 --> 01:44:38 So somehow, Ms. Dillon, who is over the Civil Rights Division of the Department
01:44:38 --> 01:44:43 of Justice now, which is clearly an oxymoron in the Trump universe,
01:44:44 --> 01:44:50 she and Pam Bondi, the Attorney General, decided Don Lemon was the ringleader.
01:44:52 --> 01:44:56 And so they decided to go after him,
01:44:56 --> 01:45:00 and then they went after Georgia, and then
01:45:00 --> 01:45:05 they arrested Georgia as well. Because initially they couldn't get any charges
01:45:05 --> 01:45:11 when they got Nekima and the other organizers; they couldn't get an indictment
01:45:11 --> 01:45:19 on Don. And I guess they kept working on it and doctored it up. I'm sure Pam was putting pressure on Ms.
01:45:19 --> 01:45:21 Dillon to come up with something.
01:45:22 --> 01:45:32 And so they got a grand jury, a federal grand jury, to indict them, and they arrested them.
01:45:34 --> 01:45:39 And, you know, there were other journalists there, but you arrested the two
01:45:39 --> 01:45:41 black journalists who were there.
01:45:42 --> 01:45:48 So no coincidence, right? It is what it is. You arrested the two black
01:45:48 --> 01:45:50 journalists that were there.
01:45:51 --> 01:45:53 They must have had something to do with it because the protesters,
01:45:54 --> 01:45:57 the majority of them were black. So they must have had something to do with it, right?
01:45:58 --> 01:46:05 So, you know, meanwhile, in D.C., Donald Trump
01:46:05 --> 01:46:12 was attending a grand opening, or premiere, of his wife's documentary,
01:46:12 --> 01:46:23 which was basically a $40 million bribe from Amazon to the Trumps to stay in their good graces.
01:46:24 --> 01:46:32 It cost $40 million to do a documentary that comes out, I think, like 10 days before the election.
01:46:36 --> 01:46:46 Anyway, $40 million on a documentary. So once they heard that Mr.
01:46:46 --> 01:46:50 Pretti was shot, they didn't think, well, maybe we can move it a day.
01:46:50 --> 01:46:52 Oh, no, the show must go on.
01:46:52 --> 01:46:57 It didn't happen here in D.C. It happened in Minneapolis, so why should we care?
01:46:57 --> 01:47:01 Now, the NBA stopped an actual basketball game.
01:47:01 --> 01:47:05 That didn't come from the Timberwolves. That didn't come from the visiting team.
01:47:05 --> 01:47:09 That came from the league. The league said, yeah, no, we're not playing a game
01:47:09 --> 01:47:11 tonight in Minneapolis.
01:47:11 --> 01:47:16 The NBA canceled a game, but the first lady of the United States couldn't put
01:47:16 --> 01:47:18 off a premiere for one day.
01:47:19 --> 01:47:22 I mean, those are the kind of people we're dealing with. It's just,
01:47:23 --> 01:47:29 it's to the point now where it's beyond being angry about it,
01:47:29 --> 01:47:34 which I encourage people to still be angry, but it's really ridiculous.
01:47:35 --> 01:47:42 It's, you know, there's anger and then there's frustration added to that.
01:47:42 --> 01:47:47 And the ridiculous part creates the frustration because we don't seem to have
01:47:47 --> 01:47:54 any real leadership, at least leadership that has backbone, right?
01:47:54 --> 01:47:58 You know, and there's some people I don't have high expectations for, like Henry Cuellar.
01:47:58 --> 01:48:04 Or, you know, after this young man was shot. Now, there have been three people
01:48:04 --> 01:48:09 shot in Minneapolis,
01:48:10 --> 01:48:12 but two people have been killed,
01:48:12 --> 01:48:18 which has basically tripled the homicide rate in Minneapolis for this year.
01:48:19 --> 01:48:25 So after that, they had a vote. Congress decided to actually take a vote on something.
01:48:25 --> 01:48:32 They voted on a budget package to avoid the shutdown, because that was
01:48:32 --> 01:48:36 looming at the end of January.
01:48:39 --> 01:48:41 And so, as I'm recording this, I don't know
01:48:41 --> 01:48:48 if the shutdown has happened or not. You know, we'll catch up if it did. But
01:48:48 --> 01:48:52 as of right now, it looks like it wasn't going to happen, and here's why. Because
01:48:52 --> 01:48:59 the shutdown could have happened in the House, but seven Democrats, including Henry Cuellar,
01:49:00 --> 01:49:05 sided with the Republicans to pass the continuing resolutions.
01:49:06 --> 01:49:10 Now, again, I don't have a whole lot of high hopes for Cuellar.
01:49:10 --> 01:49:15 Cuellar's been like, I don't even know, does he know the difference between
01:49:15 --> 01:49:21 a Democrat and a Republican? I think he just runs as a Democrat because the district is Democratic.
01:49:22 --> 01:49:27 But, you know, he got Donald Trump to pardon him and then turned around and told
01:49:27 --> 01:49:28 Donald Trump, I'm not switching parties.
01:49:29 --> 01:49:31 So he gets over on everybody.
01:49:32 --> 01:49:37 That's just him. And if the people keep voting for him, you get what you vote for.
01:49:39 --> 01:49:47 But one of the seven, a young lady out of Spokane,
01:49:47 --> 01:49:53 Washington, who basically beat a super MAGA guy to get into Congress, decided to do a video.
01:49:53 --> 01:49:59 But she's more or less like the House version of Kyrsten Sinema, because she's
01:49:59 --> 01:50:05 voted against student loan relief and, you know, some other progressive things.
01:50:06 --> 01:50:11 And, you know, she tried to explain why she voted for the resolution and it
01:50:11 --> 01:50:15 was just kind of like probably we're better off not even doing the video,
01:50:15 --> 01:50:17 probably just take your lumps.
01:50:18 --> 01:50:20 She's got an opponent in the primary.
01:50:22 --> 01:50:27 So we'll see how that goes. So anyway, it passed the House and now it was over
01:50:27 --> 01:50:33 in the Senate and so, you know, the senators were kind of talking tough and
01:50:33 --> 01:50:36 saying that they weren't going to vote for it.
01:50:36 --> 01:50:40 The Democrats, as well as a couple of Republicans, said they weren't going to vote
01:50:40 --> 01:50:44 for the continuing resolutions until something happens at Homeland Security.
01:50:45 --> 01:50:48 You know, got to get these folks out of Minnesota.
01:50:48 --> 01:50:54 You got to, you know, get Kristi Noem out. Just, you know, there was a lot of demands.
01:50:55 --> 01:50:59 But, you know, and then Hakeem Jeffries got up there and said,
01:50:59 --> 01:51:05 well, if she doesn't resign, we're going to go forward with impeachment proceedings,
01:51:06 --> 01:51:07 you know, started drafting it up.
01:51:08 --> 01:51:15 Now, of course, he couched that by saying, you know, once the Democrats get
01:51:15 --> 01:51:19 control of the House, that's going to be the first order of business if she
01:51:19 --> 01:51:21 hasn't resigned by November,
01:51:21 --> 01:51:24 I guess. So...
01:51:26 --> 01:51:29 And supposedly they've worked out some kind of deal where they're going to let
01:51:29 --> 01:51:32 the other continuing resolutions go,
01:51:32 --> 01:51:39 but they're going to set aside the Homeland Security continuing resolution and
01:51:39 --> 01:51:46 change that to like keep it going for two weeks instead of, you know,
01:51:47 --> 01:51:52 voting for the package like they did in the House for the whole fiscal year.
01:51:52 --> 01:51:58 And then give them a couple weeks to discuss all of the demands that Democrats want.
01:51:58 --> 01:52:04 And supposedly they've agreed on that. Supposedly President Trump has given his blessing to that.
01:52:04 --> 01:52:14 And I'm like, it's just gotten to a point now where nobody will take the hard stand.
01:52:15 --> 01:52:18 Nobody will just say, you know what, just shut this down.
01:52:19 --> 01:52:25 I've been trying to get these folks to shut it down from the very beginning, right?
01:52:26 --> 01:52:32 If Chuck Schumer had shut it down the first time they had the opportunity to shut it down, cool.
01:52:32 --> 01:52:36 Then when you did shut it down, you didn't really have a plan.
01:52:36 --> 01:52:38 You didn't have an exit strategy.
01:52:38 --> 01:52:42 You just said, well, we got to shut it down because we got jumped on for not
01:52:42 --> 01:52:43 shutting it down the first time.
01:52:45 --> 01:52:48 You didn't have an exit strategy. When they were cutting off everybody's food stamps,
01:52:48 --> 01:52:51 you didn't know how to counter that, right?
01:52:52 --> 01:52:57 And of course, the government employees being laid off and, you know,
01:52:57 --> 01:53:04 so that ended with no real changes in healthcare subsidies.
01:53:05 --> 01:53:12 People still didn't get the tax break or the subsidy to offset the health care costs.
01:53:13 --> 01:53:14 So now here's the third opportunity.
01:53:15 --> 01:53:21 We've had two people killed by the federal government and the government is still operating.
01:53:24 --> 01:53:27 We're making deals instead of making demands.
01:53:28 --> 01:53:38 I don't get it. I don't get it, but, you know, maybe people smarter than me do, but I don't get it.
01:53:38 --> 01:53:45 And, you know, this is the time where you use all the tools in the rule book to shut it
01:53:46 --> 01:53:51 down. Whatever parliamentary procedure you have to pull out,
01:53:51 --> 01:53:58 whatever no votes you have to give, whatever you need to do, this is the time.
01:53:59 --> 01:54:03 You know, the president is under the delusion that he's still popular.
01:54:04 --> 01:54:07 He's always going to think that for the rest of his life.
01:54:08 --> 01:54:14 People say that's senility, but he was that way when he was fully functional,
01:54:14 --> 01:54:17 if he ever was technically fully functional.
01:54:18 --> 01:54:22 But when he was much younger, he had these same delusions of grandeur.
01:54:22 --> 01:54:26 So that's not going to change. It's like his nature.
01:54:27 --> 01:54:31 But the polls are showing that people are pissed, right?
01:54:32 --> 01:54:37 And how they're planning this No Kings march in March.
01:54:38 --> 01:54:43 But I don't know how many people are going to be dead by then at the rate they're going, right?
01:54:44 --> 01:54:48 And then we got the flip side.
01:54:52 --> 01:54:59 Where we've got somebody like a Nicki Minaj who is so caught up in her personal
01:54:59 --> 01:55:03 agenda that she's tone deaf to what's happening.
01:55:04 --> 01:55:11 You know, she's trying to get some kind of relief for her husband,
01:55:11 --> 01:55:13 some kind of relief for her brother.
01:55:14 --> 01:55:19 And now she's got her Trump gold card, which means that she paid a million dollars
01:55:21 --> 01:55:26 to get permanent residency in the United States. Up until this point, she had not done that.
01:55:27 --> 01:55:32 She's paid taxes. She's lived in the country. She's done her thing,
01:55:32 --> 01:55:37 but she had not gone through all the steps to be at least a permanent resident.
01:55:38 --> 01:55:41 And I guess with all the stuff that was going on and some of the things she
01:55:41 --> 01:55:48 had said about Trump in the past, I guess she felt this was her time to get her manumission.
01:55:50 --> 01:55:53 And, you know, she got on Twitter and showed it.
01:55:55 --> 01:55:59 So, you know, it's just a terrible time.
01:56:00 --> 01:56:03 But this is the reason why we don't deify human beings.
01:56:03 --> 01:56:10 This is the reason why we don't glorify human beings too much or that we shouldn't do it.
01:56:11 --> 01:56:15 Because human beings are human beings, which means that if you put them up on
01:56:15 --> 01:56:18 a high pedestal, they're going to let you down because they're going to be human.
01:56:19 --> 01:56:23 And I think, you know, if you're a fan of the music, great.
01:56:23 --> 01:56:29 But if you're part of the fan club, that's pushing it because fan is short for fanatic.
01:56:31 --> 01:56:37 But it's like, you know, a lot of these people that we are entertained by are not on our side.
01:56:39 --> 01:56:47 And, you know, they might rap about this or, you know, because of our culture,
01:56:47 --> 01:56:49 we embrace them for their athletic prowess.
01:56:50 --> 01:56:56 But a lot of them, once they get to a certain financial status, not on our side.
01:56:56 --> 01:57:01 I just watched Mike Epps' special and he was joking about the fact that when
01:57:01 --> 01:57:05 black folks get some money, they try to move away from black folk, right?
01:57:06 --> 01:57:15 You know, and so, you know, I know there's a lot of people that are disappointed about that.
01:57:16 --> 01:57:23 But as we stated, I am now 61 years old and I've been a black man all my life.
01:57:24 --> 01:57:27 So I'm used to it. I'm not happy about it.
01:57:27 --> 01:57:32 It's like, OK, they got another one of us to do their work.
01:57:32 --> 01:57:37 Right. You know, this whole thing popped off when they were trying to arrest Don
01:57:37 --> 01:57:44 Lemon. Nicki Minaj started going in on Don Lemon, using homophobic slurs and
01:57:44 --> 01:57:48 all this stuff. And, you know, I guess because Don went in on her for
01:57:49 --> 01:57:56 being on the show with Charlie Kirk's wife, or widow, I should say.
01:57:57 --> 01:58:02 And that's a bizarre, weird thing.
01:58:03 --> 01:58:10 So, you know, we're in an age, you know, we're a long way from an age of heroes.
01:58:10 --> 01:58:17 We're in an age of ridiculousness. It makes sense that a show called Ridiculousness
01:58:17 --> 01:58:22 is the longest running show on American television right now.
01:58:23 --> 01:58:26 Because that's where we are. We're at this point.
01:58:28 --> 01:58:35 And if it wasn't killing us, if it wasn't oppressing us, it'd be funny.
01:58:35 --> 01:58:39 And we try to make light of it so we can get through it.
01:58:39 --> 01:58:45 That's why we value comedians like Jimmy Kimmel and Dave Chappelle and all these other folks.
01:58:45 --> 01:58:57 But this has really gone to a level that if you ain't trying to fix the problem, you are the problem.
01:58:57 --> 01:59:07 If you are not willing to shut it down, if you're not willing to give it your all, you know, then...
01:59:10 --> 01:59:15 But part of the problem, the old Army adage is lead, follow,
01:59:15 --> 01:59:17 or get out of the way, right?
01:59:18 --> 01:59:23 And the people that need to get out of the way are the biggest impediments to
01:59:23 --> 01:59:24 what we're trying to deal with.
01:59:25 --> 01:59:32 Again, like I said, man, this is a crazy time that we live in.
01:59:33 --> 01:59:43 I don't know what else to tell you, man. I just hope that we get some resolve, right?
01:59:43 --> 01:59:48 You know, I know our everyday lives dictate that we do something.
01:59:48 --> 01:59:51 We got to go to work and deal with whatever's going on at work.
01:59:51 --> 01:59:54 We have families and we got to deal with whatever's going on with our family.
01:59:55 --> 02:00:03 But I think the most important sacrifice we need to make is to make time to fight for this nation.
02:00:03 --> 02:00:09 Because anything that's going on in your life right now, if this nation falls,
02:00:10 --> 02:00:11 that's going to change it.
02:00:12 --> 02:00:16 You're worried about what school to choose. In an authoritarian setting,
02:00:16 --> 02:00:18 you may not get that choice anymore.
02:00:19 --> 02:00:23 You know, what kind of job you got, you may not have that choice anymore.
02:00:25 --> 02:00:30 How much money you make, you may not have a choice anymore.
02:00:31 --> 02:00:38 You know, I can imagine because we all human beings and human beings have evolved
02:00:38 --> 02:00:43 to a degree, but behavior wise, not that much.
02:00:44 --> 02:00:48 I'm sure there were folks that were just trying to go about their everyday life
02:00:48 --> 02:00:52 in the middle of the American Revolution 250 years ago.
02:00:52 --> 02:00:57 I imagine. I imagine there were some people that were more concerned about what
02:00:57 --> 02:01:03 they were going to do at their blacksmith shop or candle-making shop or whatever,
02:01:03 --> 02:01:08 more so than the politics that was going on, the revolution that was ongoing,
02:01:09 --> 02:01:14 you know, that was coming forth, or that was already taking place.
02:01:15 --> 02:01:17 Right. Shots had been fired. Right.
02:01:19 --> 02:01:23 There were people that were trying to cozy up to the British, you know?
02:01:24 --> 02:01:26 You know, there were black folks that looked at the British and said,
02:01:27 --> 02:01:28 well, at least I'll be free.
02:01:29 --> 02:01:34 Not understanding that all the ships that were bringing them in were British ships,
02:01:34 --> 02:01:40 the same ships taking their cousins to the Caribbean islands, right?
02:01:42 --> 02:01:47 And, but they figured, well, you know, if we side with the British, we'd be free.
02:01:48 --> 02:01:53 And there were those of us that sided with the colonists thinking,
02:01:53 --> 02:01:59 well, we fight for the freedom of this nation. We might get our own freedom too, right?
02:02:00 --> 02:02:09 So I get it that there's no guarantees that things are going to be better if we get rid of Trump.
02:02:09 --> 02:02:17 But just like the gamble that was taken by Crispus Attucks and other blacks
02:02:17 --> 02:02:23 like him to side with the colonists, I just think the alternative would have been much worse.
02:02:25 --> 02:02:32 Because when we won the war and the Constitution was written, there was an effort
02:02:32 --> 02:02:35 on paper to end the slave trade.
02:02:35 --> 02:02:40 There was actually a date put into the Constitution to end the slave trade, right?
02:02:41 --> 02:02:46 Don't know if that would have happened if King George had succeeded.
02:02:46 --> 02:02:50 I mean, eventually it would have ended, right? Maybe.
02:02:51 --> 02:02:57 But there wouldn't have been a civil war to decide it, because we would have all been colonies of England.
02:02:58 --> 02:03:05 We would have all been part of the British Empire, which was exploiting everybody and everything. Right?
02:03:06 --> 02:03:13 I mean, at some point, we've got to show some resolve and fight for what is right.
02:03:15 --> 02:03:17 We've got to do that, eventually. Right?
02:03:19 --> 02:03:26 If that means that you might not need to go to choir practice, or you might not
02:03:26 --> 02:03:33 need to go out to dinner, or you might not need to go shopping, not grocery
02:03:33 --> 02:03:34 shopping, but just shopping,
02:03:35 --> 02:03:38 you know, an indulgence, if you will. Right.
02:03:39 --> 02:03:42 Because people have talked about, well, why don't we just have a national strike?
02:03:43 --> 02:03:45 We're too selfish for that to work.
02:03:47 --> 02:03:48 Nobody's willing to make the
02:03:48 --> 02:03:55 sacrifice for that to work but I would I pray that we get to that point.
02:03:57 --> 02:04:02 It doesn't make sense now. It is ridiculous now. It is a joke now.
02:04:03 --> 02:04:11 These folks have succeeded in making something that took 250 years to build almost irrelevant.
02:04:12 --> 02:04:21 And for the few of us that still believe, you know, we just need the rest of y'all to trust us.
02:04:22 --> 02:04:27 Make that sacrifice. Make that sacrifice to give these people hell.
02:04:27 --> 02:04:35 And just, you know, some point we just got to shut this down because I can't
02:04:35 --> 02:04:42 imagine the leadership that we have now doing what needs to be done like Mandela
02:04:42 --> 02:04:46 and them did in South Africa or what we did,
02:04:46 --> 02:04:51 what the world did with the Nuremberg trial, right?
02:04:51 --> 02:04:57 I just can't see us doing that. I can't envision us doing the right thing and
02:04:57 --> 02:05:01 disbarring all of these lawyers doing this evil work.
02:05:01 --> 02:05:09 I can't envision us putting people in jail for sedition for what they've done to this nation.
02:05:09 --> 02:05:16 Right? Because all of this foolishness, all of this ridiculous stuff is basically sedition.
02:05:17 --> 02:05:23 That might sound harsh to people, but if you're trying to destroy the American
02:05:23 --> 02:05:25 government, that's what sedition is.
02:05:26 --> 02:05:33 But we want to just say, oh, it's politics, da-da-da-da-da. Nothing is normal. It is not normal.
02:05:33 --> 02:05:42 So I just want to end with that. I hope and pray that when we decide we've had enough,
02:05:42 --> 02:05:45 we can be unified enough to let them know.
02:05:46 --> 02:05:52 And shut it down. Now, whatever comes after that, we'll deal with that.
02:05:54 --> 02:05:58 We took a gamble 250 years ago to break away from an empire.
02:05:59 --> 02:06:06 There were a lot of bumps on that road because there were some mentalities that we needed to overcome.
02:06:07 --> 02:06:09 But now we're at another point.
02:06:10 --> 02:06:15 We're at another road marker. And we're going to have to make a decision.
02:06:15 --> 02:06:23 You have to overcome your fear, your timidity, and your apathy because of this
02:06:23 --> 02:06:27 stuff, this ridiculous state that we're in.
02:06:27 --> 02:06:36 Oh, I didn't even mention, they came and took the voting files from the 2020 election in Georgia.
02:06:37 --> 02:06:42 And you got Tulsi Gabbard, who is trying to stay in good graces.
02:06:43 --> 02:06:48 She had no input about what was going on in Venezuela, had no input of what
02:06:48 --> 02:06:53 was going on with Iran or any of these international conflicts.
02:06:53 --> 02:06:54 Or even in the Greenland conversation.
02:06:55 --> 02:07:00 But they sent the Director of National Intelligence to go get voter files from
02:07:00 --> 02:07:05 the county warehouse in Georgia, in Fulton County, in Atlanta.
02:07:06 --> 02:07:08 All of this stuff, man.
02:07:09 --> 02:07:14 Somebody said maybe they're going to throw Tulsi in there.
02:07:14 --> 02:07:18 Maybe they're going to try to get Maduro to say that Venezuela had something
02:07:18 --> 02:07:21 to do with the 2020 elections being rigged and Donald Trump losing.
02:07:23 --> 02:07:27 I wouldn't put it past them because we're at a point of everything being ridiculous.
02:07:29 --> 02:07:32 So guys, just y'all chew on that.
02:07:32 --> 02:07:36 And y'all say, well, Fleming, you know, you can say all that,
02:07:36 --> 02:07:38 but I got to live my life. I feel you.
02:07:39 --> 02:07:45 But when your life is totally shattered, when you had a chance to do something
02:07:45 --> 02:07:49 to stop that, you got to deal with that too.
02:07:50 --> 02:07:53 All right, guys, thank y'all for listening. Until next time.