Trump's plan to win the AI race
Newcomer Pod · July 25, 2025 · 00:40:42 · 37.27 MB


Tech leaders unveiled America's AI Action Plan on Wednesday, and President Trump signed three executive orders with big handouts to AI companies. Madeline returns fresh off a trip to Washington, DC and gives Tom and Eric the lowdown on tech's victory lap in the nation's capital. Plus, even the altruists at Anthropic feel the need to raise Middle East money.

Timecodes:
01:45 - The All-In podcast might as well be state-run media
05:16 - The vibes at the All-In Hill and Valley event
08:50 - The coming AI abundance
12:29 - Woke AI and the ministry of truth
15:30 - America's strategic advantage is President Trump
22:36 - The AI copyright kerfuffle
32:36 - Dario faces the harsh reality of capitalism



00:00:00
Hello. Hello, everybody.

00:00:01
It's Tom Dotan here bringing you the latest episode of the

00:00:04
Newcomer podcast, joined by Eric Newcomer of Newcomer and

00:00:08
Madeline Renbarger of Newcomer. Welcome everybody.

00:00:11
How you guys doing? You love that.

00:00:13
I'll never stop. I asked Tom if he wanted to

00:00:15
change the podcast name. He said no.

00:00:18
It's a brand. It's a big giant end.

00:00:20
Well, I'm just a contractor, so I'm actually a free radical here

00:00:23
You guys are deep, deep in the swamp of Newcomer.

00:00:25
Speaking of the swamp, Madeline, you just returned from the

00:00:29
swamp. I was talking to you yesterday.

00:00:31
You were at a defense tech conference, Hill and Valley.

00:00:34
Why is it called Hill and Valley?

00:00:36
So actually, let me clarify: it was co-hosted with the Hill and

00:00:40
Valley Forum, but it was actually called Winning the AI

00:00:43
Race, which was a new conference also co-hosted with the All-In

00:00:46
podcast. So I want to clarify for the

00:00:47
brand's sake, a different event, but it's called Hill and Valley

00:00:51
because it's, you know, the merger of, you know, DC, the

00:00:54
Hill and the Silicon Valley coming together.

00:00:58
Silicon Valley. See, I was thinking, no, I'm not

00:01:01
smart in that way. And I was racking my

00:01:04
brain because you know, there are some like Chinese based

00:01:06
private equity firms called, like, Hillhouse Capital and stuff, and

00:01:10
you never know what that means. I thought this was some weird

00:01:12
permutation of that, but no. This podcast is supported by

00:01:16
Google. Hi folks, Paige Bailey here from

00:01:18
the Google DeepMind DevRel team. For our developers out there, we

00:01:21
know there's a constant trade off between model intelligence,

00:01:24
speed and cost. Gemini 2.5 Flash aims right at

00:01:27
that challenge. It's got the speed you expect

00:01:29
from Flash, but with upgraded reasoning power.

00:01:32
And crucially, we've added controls like setting thinking

00:01:35
budgets so you can decide how much reasoning to apply,

00:01:37
optimizing for latency and costs.

00:01:39
So try out Gemini 2.5 Flash at aistudio.google.com and let us

00:01:43
know what you build. Tell me, tell me so and you just

00:01:46
found out at some point as you were there like, ah shit, I'm at

00:01:49
an All-In conference. Yeah, so it was advertised as

00:01:52
being co-hosted by All-In. But what I did not realize

00:01:55
getting into it is that the onstage programming was a 5-hour

00:01:58
live All-In podcast. Where the All-In.

00:02:01
Guys, I didn't even realize that.

00:02:03
Yes, the All-In guys were on stage with all of these tech

00:02:06
leaders and Trump administration officials just kind of running

00:02:09
through Trump's new AI policy and reacting to it and chumming

00:02:13
it up and hanging out on a couch.

00:02:15
That was the general vibe. There also was the classic

00:02:18
networking section outside where you could talk with founders and

00:02:21
investors that attended. My God.

00:02:23
So we're gonna have a whole future media segment of this

00:02:26
show, I guess. Man, I didn't realize that piece

00:02:29
of it. Or I guess we have to make like

00:02:31
Cerebral Valley a live podcast, like a 5 hour podcast now.

00:02:35
Like if that's the bar, we just want, like, filibustering, but

00:02:38
anyway. Well, it's kind of funny to me

00:02:40
that I mean, you know, not to start off with the All-In guys

00:02:44
because really, the less said about them, the better.

00:02:46
But like, you know, they've played a huge part in, you know,

00:02:49
the the Trump, the tech MAGA movement.

00:02:52
And obviously David Sacks is part of the administration now.

00:02:55
But this seems to be like the fullest

00:02:58
expression of like their Trump ties, right?

00:03:02
This is like the full the first time that they've done like a

00:03:04
conference around the fact that they are like deeply embedded in

00:03:07
the administration. I mean, David Sacks is working

00:03:10
for them at this point. Like they've had conferences.

00:03:13
I don't know. If anything, I feel like some of

00:03:15
the like the sex appeal and freshness of it all is going

00:03:18
away. It's like they've done

00:03:20
conferences. Trump's there while the Epstein

00:03:24
files are dropping. It's like, it feels a little bit

00:03:27
like, I don't know, state TV to me where they're like on the

00:03:31
team setting it up. Did did they bring?

00:03:34
I assume Epstein did not come up at all on stage, right?

00:03:37
You know, Eric, Epstein did not come up on stage.

00:03:40
But they're media figures on some level. Like, what media,

00:03:44
even conservative media figures, can't stop talking about Epstein

00:03:47
because it's such a good story. And so I think the fact that All

00:03:50
In would have a conference and not talk about like the biggest

00:03:54
elephant in the room. We learned that day that the

00:03:57
attorney general had told Trump that he was in the Epstein files

00:04:00
and they don't bring it up. It's just that is the smoking

00:04:03
gun that All-In is in no sense media.

00:04:06
But they're not just not media.

00:04:08
They are not working on the behalf of their audience, right.

00:04:11
If you're working on the behalf of your audience, forget.

00:04:14
I don't even want to say I'm a journalist anymore.

00:04:15
You know, my whole thing has been like, I'm a creator that

00:04:18
wants to exceed expectations, but at least I know that I am

00:04:21
working for my audience and the readers.

00:04:24
And if there was an obvious question and I had someone up on

00:04:27
stage and didn't ask it, they should make fun of us, you

00:04:31
know, and All-In is working for their subjects and not for

00:04:36
their listeners, whether their listeners want to acknowledge

00:04:39
that or not. And I think this event was sort

00:04:41
of the capstone of that sort of capture.

00:04:46
Well, I feel like their listeners and their audience

00:04:49
and, you know, I feel like it's kind of merging into one, right?

00:04:53
Like if this is the culmination of, you know, All-In post having

00:04:56
Trump on the podcast, they talked a lot about how, you

00:04:58
know, they saw the light after he was such a great guest on the

00:05:01
show to then, you know, be parts of the administration.

00:05:04
Well, Trump. Trump literally thanked even

00:05:07
Jason. Even as we know, Jason

00:05:10
Calacanis, I say even thank you, Jason.

00:05:16
All right, in classic

00:05:19
takes for long. So I've already given you a rant

00:05:21
without even laying the groundwork.

00:05:22
But Madeline, can you give us sort of a little bit of

00:05:25
the overview of the day. Yes.

00:05:27
So, spearheaded by the All-In guys, Trump was obviously the

00:05:30
draw. You did, what, the quick trip to

00:05:34
DC from New York? Train there, train back, same day.

00:05:36
What was what was on the ground who talked?

00:05:39
Just give us sort of a couple bullet points on what it was like.

00:05:42
Sure. It was definitely, I think

00:05:44
compared to past Hill and Valley forums, which I've covered, much

00:05:47
more emphasis on government officials.

00:05:49
There were, you know, Chris Power from Hadrian who just

00:05:52
raised a big amount of money, gave a presentation on stage

00:05:55
about how they're going to, you know, build up AI factories.

00:05:57
And Jensen of course, was on stage with the all in guys

00:06:00
talking, you know, thanking President Trump for accelerating

00:06:02
AI and talking about all of the great jobs that AI build up is

00:06:06
going to create. However, most of the people on

00:06:08
stage were predominantly administration officials.

00:06:11
So it felt a lot more like here we are, it's DC, we're coming on

00:06:16
tech friendly media to talk about all the great stuff we are

00:06:18
doing for you, the tech audience.

00:06:20
And this was literally like an announcement, right?

00:06:22
I mean, they literally published Winning the Race: America's AI

00:06:26
Action Plan, like right around this.

00:06:28
It was. The same day, about an hour

00:06:30
before the event kicked off. And David Sacks is the AI czar

00:06:34
bringing in his in-house media and basically, like.

00:06:37
But that's just smart event planning.

00:06:39
That's just called synergistic thinking.

00:06:40
Democrats need to get this shit together.

00:06:42
Have friendly softball questions from in house media you know.

00:06:46
Oh, I just meant like having an announcement, you know, you want

00:06:48
news to break on the day of your conference, you get people

00:06:50
talking. I mean, good, good on them.

00:06:53
That actually sounds like a very traditional media playbook.

00:06:55
And it seems to me like they're really in the pocket of

00:06:56
traditional media now. Yeah, what I heard Madeline was

00:07:01
saying that all the reporters were passed off to the balcony.

00:07:04
You couldn't even hear the toss.

00:07:05
As a member of the new media, Madeline, how are you

00:07:08
treated? As a member of the new media, I

00:07:11
felt like I should have been treated a little better since

00:07:13
we're, you know, not the lamestream media over here at

00:07:15
Newcomer. The press area was a little bit separated off from

00:07:19
the main auditorium, up on a second level.

00:07:21
They didn't really want us coming down to the main level

00:07:23
to, you know, mingle with people unless we were in the lobby, you

00:07:26
know, grabbing coffee and stuff. So I do feel like the media

00:07:30
was kind of in attendance, but not really participating in the

00:07:33
same way as past events that I've attended.

00:07:36
You should have, you should have, you know, like the

00:07:38
the press, you know, flak jackets that they get like when

00:07:40
they're embedded people. I feel like you could get a

00:07:42
separate one that says Substack on it and they just treat you a

00:07:45
little different in these events.

00:07:46
I think I should have gotten special treatment because we're

00:07:48
a Substack newsroom. I think we are the independent

00:07:51
media that they want to have covering this.

00:07:53
You know, we're, we're breaking the molds.

00:07:55
Substack needs to have its own advocacy org, you know, just

00:07:59
like there's like the White House Correspondents

00:08:01
Association. Substack should have sort

00:08:03
of a centralized like you can't treat our people like that.

00:08:06
But OK, I know it's Trump, and like, it's mostly optics, and he says

00:08:10
a bunch of insane shit. He talked about like maybe we

00:08:13
should break up NVIDIA. And then somebody explained why

00:08:16
that would be a terrible idea given Nvidia's like the heart of

00:08:19
everything we're doing. But so there was lots of insane

00:08:21
Trump shit and we can talk a little bit about like, you know,

00:08:24
his demeanor, but the substance, please, just what what?

00:08:29
Just just the substance, just quickly, what what actually

00:08:32
happened? Like, what? They wield all the

00:08:34
levers of power, like all of them.

00:08:37
Like what? What are they doing?

00:08:39
Yeah. I mean, they had on stage the

00:08:41
Vice President and the President of the United States.

00:08:44
What are they doing? Like what is?

00:08:46
What are they doing, not

00:08:48
saying? Like, what is going to happen?

00:08:50
The key thing that is going to happen per the executive orders

00:08:54
is that environmental regulation is going to be pulled back so

00:08:57
that this data center build up will be easier to do.

00:09:01
Fuck the environment. Build AI, get energy ready.

00:09:03
Let's go. Yeah.

00:09:04
It was very build, baby build, you know, instead of drill, baby

00:09:08
drill, there was build, baby build.

00:09:09
That was the vibe inside the room.

00:09:11
Clear regulation, clear cut regulation.

00:09:13
So AI can like dominate and we're going to have this energy

00:09:16
crisis. So we need to light every like

00:09:19
dead dinosaur that we got left on the planet.

00:09:22
Sure, the abundance guys are shedding a tear of joy in the

00:09:25
midst of all of that, so we can find some.

00:09:26
Yeah, it's pretty abundant.

00:09:28
We're going to have AI abundance.

00:09:30
Step 2, the second executive order really touched mostly on

00:09:34
exporting AI, to make sure that American AI can be

00:09:39
exported. So that's, you know, the plug to

00:09:41
Jensen. We're not going to totally clamp

00:09:44
you down with export controls. After all that China fear

00:09:47
mongering, it's like actually, after creating tariffs and

00:09:52
saying isolationism is the path to American success on AI,

00:09:56
somehow they've conned Trump into saying actually America's

00:10:00
AI succeeds if we can sell it everywhere.

00:10:02
So it's embedded even in China. We want NVIDIA chips in China

00:10:05
because otherwise it's going to be Huawei.

00:10:08
We are globalists and internationalists on AI and then

00:10:12
randomly on like manufacturing cars and any industry that

00:10:16
doesn't have a great lobbying firm, we're isolationists.

00:10:18
It makes zero sense. Oh, that's totally the policy.

00:10:22
Well, I was going to say too, I think what was really just, you

00:10:26
know, the tell of all of this was that during Trump's speech,

00:10:29
you know, he bragged about all of the tariffs, making jokes

00:10:31
like there's too many countries. I have to do deals with all

00:10:34
these countries and, you know, touted his recent Japan deal.

00:10:37
But then of course, with AI, it's like, cut it loose, let it

00:10:40
go because that's, you know, American soft power ruling the

00:10:43
world or else China AI will take over.

00:10:44
So I feel like the inconsistencies in the policy

00:10:47
really kind of popped off throughout the day.

00:10:50
Have you just paid a little bit of attention?

00:10:52
I obviously believe in some environmental regulation, but I

00:10:54
think clear cutting regulation to build AI is good.

00:10:59
I think, if you buy into, like, the super growth

00:11:04
mindset that we need to do this in the China race, it certainly

00:11:07
is good. I do think that the extent that

00:11:09
they want to cut it is definitely, you know, no gloves

00:11:13
at all, just build drill. But we're going to be helping

00:11:16
China now. We're selling them NVIDIA chips.

00:11:18
Like, I don't think it's about like the war with China.

00:11:21
Like I don't buy it at all. But we're also beating them.

00:11:24
Sure. We're beating them.

00:11:26
We're giving them the tools to undercut us and steal.

00:11:30
Yeah, I want to cooperate with China, so I have no problem

00:11:32
with it. Godspeed.

00:11:33
But the idea that it's going to help us beat China by giving

00:11:36
them the chips that they need to deliver the stuff doesn't make

00:11:40
sense to me. But also the export ban wasn't

00:11:42
working. Like, the whole thing ended up being they were

00:11:46
getting in there through like Indonesia and like various

00:11:48
Southeast Asian countries like you saw with DeepSeek that they

00:11:51
were plenty capable of building like competent high quality

00:11:55
models largely on chips that had been smuggled into the country.

00:11:57
So. All it was really doing was

00:11:59
hurting American chip manufacturers, which is why

00:12:01
Jensen was kind of pushing them to, you know, hey, don't, don't

00:12:04
lock this in this way. Can I ask quickly because I, I,

00:12:08
I saw the tweet, I didn't dig more into it.

00:12:11
Why did he want to break up NVIDIA?

00:12:12
What was the argument there? Why were they scary to him?

00:12:15
I might be misremembering, but it felt like it was like an

00:12:18
NVIDIA. Who are those guys?

00:12:20
What are they doing? No, they're American.

00:12:22
Asian guy that runs it, that's no good.

00:12:25
Break that thing up, but it's like.

00:12:27
You can't tell me that there wasn't at least an initial

00:12:29
thought by Trump that like they're a Chinese company.

00:12:32
Right. No, literally.

00:12:34
On stage. He more or less implied that in

00:12:36
his little NVIDIA riff he was doing in the speech.

00:12:39
Like, so there there's this AI action plan, but then there were

00:12:43
also executive orders about AI, the most disturbing of which to

00:12:47
me was this order that says if you're a government contractor,

00:12:52
you have to have unbiased AI. And they specifically talk about

00:12:57
DEI and transgenderism. And it's just like, yeah, I

00:13:02
mean, the idea of unbiased is incoherent.

00:13:05
And then in the same executive order, you're basically saying

00:13:08
what you can't talk about. So it's like inserting bias.

00:13:12
I, I don't know, I find that deeply troubling.

00:13:14
The last thing I'll say on this, I, you know, one of my more

00:13:17
viral tweets, I said, you know, no atheists in a foxhole, no

00:13:21
libertarians in a bank run when David Sacks was like begging for

00:13:25
SVB to, to get a bailout. And now it's like, you know,

00:13:28
there are no, there are no free marketers in government.

00:13:31
It's like as soon as David Sacks touches the government, they're

00:13:34
putting out executive orders saying let's let's regulate AI

00:13:38
so that you can't explain transgenderism to people.

00:13:42
Well, it's always just the point of, you know, the free speech

00:13:45
for me and not for thee. We saw this with the social

00:13:47
media wars, right? Like this felt like kind of

00:13:49
social media Part 2 where Zuckerberg caved and, you know,

00:13:52
got rid of community notes and Meta and all of this stuff to,

00:13:55
you know, get close to Trump and buddy-buddy with the

00:13:57
administration. And so by just declaring that

00:14:00
for AI companies to work with the government in any way,

00:14:03
especially with all these DoD contracts on the line, they have

00:14:05
to, you know, remove bias just like that just means that they

00:14:09
have to censor certain topics because the training data is

00:14:11
already in there. Like what do they want?

00:14:12
And of course, which AI company has seemed the most biased so

00:14:16
far? You're talking about the one

00:14:17
that has written into its prompts, like,

00:14:20
What would Elon say on this topic?

00:14:22
Right, the one that literally talks in first person sometimes

00:14:25
as the owner and searches like what's the guy who owns me think

00:14:29
before I answer this. Well, I also love the idea of

00:14:33
like, you know, a contractor having to explain to whatever

00:14:36
the DoD or any other organization that's going to

00:14:38
sign a huge, you know, trying to get like a huge RFP for the

00:14:42
different language models. Like Sir, we have found the most

00:14:45
unbiased of all of the chat bots that are out there.

00:14:48
Side point: it is MechaHitler. So you will have to work

00:14:52
through that. But yes, as we ran through it,

00:14:55
the most unbiased one happens to also give specific instructions

00:14:58
on how to attack Will Stancil in the night.

00:15:00
But it is the least woke. So that is the one we're going

00:15:03
to have to choose based on the EO.

00:15:05
Yeah, it is just the hypocrisy of, you know, bias just meaning

00:15:10
different from Trump's positions on issues like that is if we're

00:15:13
just classically going back to that, it's just the greatest

00:15:16
hits of that. Again, we're just putting that

00:15:17
in an executive order. And I just, I'm curious to see

00:15:21
how companies respond to it because I mean, working with the

00:15:24
federal government has proven to be very lucrative for a lot of

00:15:27
these companies so far. So I don't know what's gonna

00:15:30
happen. Well, we, we, we also saw like

00:15:32
in Trump one like vendettas being used as a way to hand out

00:15:36
contracts or stop other companies from getting contracts

00:15:38
was like fairly standard procedure.

00:15:40
Like Amazon AWS lost their shit after the Pentagon gave a

00:15:44
contract to Azure and to Google over AWS because of, you know,

00:15:48
Trump's anger at Bezos and, and Amazon and the Washington Post.

00:15:53
And so, yeah, no, I'm just saying like there's a real

00:15:55
reason for these people to, like, play up to, you know, curry

00:15:58
favor and like be as sycophantic as possible to Trump because.

00:16:01
Oh my God, yeah. Didn't Jensen literally say

00:16:03
America's going to win in AI because of Trump or something?

00:16:06
Yeah, America's unique advantage that no country could possibly have is

00:16:12
President Trump. There there are a lot of

00:16:16
compromises I wouldn't want to make as a public company CEO.

00:16:19
Like, I don't know, but like the Oh yeah, Trump, you're a genius.

00:16:23
Like everything through you. It's like it's such a low

00:16:26
hanging way to curry favor that I can see why it's so tempting.

00:16:32
And Jensen smartly, you know, got his export controls

00:16:35
overturned, so it's a good play. His approval ratings, and Tom,

00:16:37
you were making this point before the show.

00:16:39
They are bad, like, somehow, like, until we have the

00:16:42
midterms, I feel like the culture, you know, if this were

00:16:44
like a normal president with approval ratings this bad,

00:16:47
people would, like, start to not take him seriously.

00:16:50
But I think it's because the Republicans would get primaried

00:16:53
anyway that they have to be so locked up.

00:16:55
I don't know. Yeah.

00:16:57
Tom, what's your observation on the.

00:16:58
I think. We've talked about this a couple

00:17:01
times on the show, like, you know, the bald faced sycophancy

00:17:04
of the tech, you know MAGA, right?

00:17:06
Whatever you want to call it toward Trump at one point seemed

00:17:09
fairly prescient, right? If you were, I don't want to

00:17:12
talk about Shaun Maguire again. But if you were one of those

00:17:13
people that were backing Trump during the 2024 election, you

00:17:18
looked really smart, like you looked way ahead.

00:17:20
Well, I would say to this point, Tom, all the Hill and Valley

00:17:23
Forum leaders who, you know, put on this summit in the first place

00:17:26
with Jacob Helberg, you know, kind of spearheading this with

00:17:29
Delian and Christian Garrett, they, you know, hold the

00:17:32
anchors of power in some way. And you know, Helberg is going to

00:17:35
be appointed in the administration.

00:17:37
Republicans, if you're willing to be the one smart person

00:17:40
around a Republican, you could like, fucking ascend to the top

00:17:43
immediately. It's like it's a vacuum.

00:17:44
It's the easiest hack. It's like so many fucking

00:17:47
lawyers do that. But that's under the assumption

00:17:50
that being smart helps you as a Republican, which I don't really

00:17:52
think it does. I mean, but they always want.

00:17:54
I don't think to dominate. I mean, you see.

00:17:56
Yeah, you can. I think they can tell you the

00:18:00
story that being quote-unquote smart doesn't necessarily make

00:18:02
you the like the popular guy or the most powerful person in the

00:18:05
party. But you know, like Sam Altman is

00:18:07
an interesting case with that because he, a lifelong Democrat,

00:18:11
you know, is as public about hating Trump as almost anyone in

00:18:14
tech after he gets elected. I mean, he makes as a source

00:18:16
told me like not too long ago, he made a determination like

00:18:19
midway through the campaign in '24: Trump's going to win.

00:18:22
I need to start getting closer to him and he starts doing that

00:18:25
and then he tweets out, you know, oh, I misunderstood you,

00:18:27
Trump. Everything I thought, you know,

00:18:29
was, you know, intermediated by a biased media and all this

00:18:33
bullshit that everyone has to say in order to make themselves

00:18:35
feel, you know, close to Trump. So but that all seemed very

00:18:38
smart. And now, like, I was talking

00:18:39
about 37% approval ratings as low as it's ever been for him.

00:18:43
It's not as low as it got during Trump one, but like, he is not a

00:18:46
popular president. And so when you see,

00:18:48
and it's not, I don't know if it's going to get better or not,

00:18:50
but it's unlikely to turn around in a major way.

00:18:53
And so when you see the same rhetoric from the tech crowd

00:18:57
supporting him, like Jensen saying, you know, we couldn't

00:18:59
have done it without you, Trump, the big advantage that we have

00:19:01
is that you, we have Trump and no one else does.

00:19:04
It's so blatant as to what's going on here.

00:19:06
You're not tapping into a zeitgeist.

00:19:08
You're just trying to get the best possible deal for your

00:19:09
company. You're trying to line your

00:19:11
pockets. And that's fine.

00:19:14
That's business. It's gone like that

00:19:15
forever. They want to get rich, they want the companies to do

00:19:17
well and that that's their job. But I, I do think they also want

00:19:20
to see, they think AI is like a bigger force than honestly, like

00:19:24
the, the US government and like the US government has a lot of

00:19:27
power. But it's like we need to suck up

00:19:29
to Trump, and he's going to, like, clear-cut the government and say

00:19:33
we can do whatever we want to power the data centers.

00:19:35
We need to build God. Like, that's, that's what they want.

00:19:40
And Eric, to your point, I, I think that is what they want.

00:19:42
And I feel like they've really got it.

00:19:44
Like, you know, I was, you know, commenting like all this point

00:19:47
about, you know, Trump being low energy and like, you know, the

00:19:49
Epstein story breaking while he's on stage, Like none of that

00:19:52
seems to really matter. They got the order, you know,

00:19:55
like it's, it's been the the path has been cleared.

00:19:57
It will matter electorally, but like they're willing to just

00:20:00
like, well, yeah, we won't talk about any of that.

00:20:02
Just like we'll we'll say you're awesome.

00:20:04
Like give us our clear cutting and like.

00:20:06
Let's go forward. Right.

00:20:08
The action plan is already out. The orders are already signed

00:20:10
like you know they're good to go.

00:20:12
Yeah, and they don't really need a Congress to support any of

00:20:15
these things anyway, so they might as well build as quickly

00:20:17
as they can until Trump, you know, completely loses his

00:20:19
ability to do any of the things that they want.

00:20:22
Because I guess you can always sign EOs.

00:20:25
No one's really going to stop you from that.

00:20:27
And I don't even know how. I mean, is it clear, actually,

00:20:29
you were talking to people there, like, whether or not these

00:20:32
executive orders were going to do all that much.

00:20:34
I mean, there's a lot of posturing with these things.

00:20:37
You're not actually changing that many rules.

00:20:39
They're not signing things into law.

00:20:40
You're not getting rid of active restrictions.

00:20:42
I mean, you know, this, we're poking around like a theme of

00:20:45
this administration is like, who cares what appropriators said

00:20:50
this money was for? Like, we'll do whatever we want.

00:20:52
So there is a degree to which, if the administration is willing

00:20:56
to say they want something and they're willing to bend all the

00:20:59
rules about using money appropriated by Congress however

00:21:02
they want, this is a time where executive orders are more

00:21:06
powerful than ever if they can navigate the bureaucracy.

00:21:10
What stood out to you among the EOs that you think is going to

00:21:13
be the most important? Because I have some thoughts,

00:21:15
but I'm curious as you were there, like what what mattered

00:21:18
to you? What stood out to me, there's

00:21:19
something that stood out from, you know, our standpoint as

00:21:22
reporters that could shape the form of AI, which I do think

00:21:25
the, you know, removing government contracts, unless

00:21:28
your AI isn't woke, or whatever, will have impacts on how

00:21:33
this AI forms in the future. But I think from the business

00:21:36
standpoint, you know, streamlining regulations and

00:21:39
gutting climate rules to, you know, do this massive build up I

00:21:41
think will be the most consequential for the AI

00:21:43
industry. What do you think, Tom?

00:21:46
Yeah, I sorry, just back on like the, you know, government

00:21:49
contracts for AI. We've seen a bunch of the big

00:21:52
companies, the big labs and model makers sign, you know,

00:21:55
multi-hundred-million-dollar deals with the government recently.

00:21:58
Like I saw xAI got a big contract, which is interesting.

00:22:02
OpenAI got one as well. So we're like at the beginning

00:22:05
stages of like actual tangible contracts between these

00:22:09
companies and and the government.

00:22:10
I do want to say as a point, it does often get misreported that

00:22:14
like OpenAI signed a $200 million contract with the DoD.

00:22:17
That's not quite right. They signed like a $5

00:22:20
like basically exploratory contract with the potential of

00:22:23
it getting up to $200 million. So people should not report it

00:22:27
as like $200 million is going directly into OpenAI as like

00:22:30
some sort of government enterprise contract deal.

00:22:32
It's a little different than these things typically get

00:22:34
reported. But the one part that stuck out

00:22:37
to me, and again, I don't know how much this is going to matter

00:22:40
in the large scheme of things is the fact that, you know, and I

00:22:43
think Trump mentioned this in his speech, that AI model should

00:22:46
be given clearance to train on copyrighted materials.

00:22:51
And it's a huge issue right now that's playing out really

00:22:55
literally in the courts over, you know, when all of these

00:22:59
models were built, you know, ChatGPT's models, GPT-4,

00:23:02
whatever, Anthropic's models, they basically ingested the entire

00:23:06
web and, you know, published materials in order to train on

00:23:09
these things. And as the New York Times is

00:23:11
currently suing right now, the New York Times, because it was

00:23:14
trained on their content, should be compensated for the

00:23:18
information that they created that was later used to create

00:23:21
these models, which are obviously now hyper competitive

00:23:24
with traffic to all of these news sites.

00:23:27
And I also saw there was, was it like a Gallup or was it like a Pew

00:23:29
poll or a Pew study today that showed that traffic has declined

00:23:33
substantially to a lot of new sites based on the AI overviews

00:23:38
that Google does. And so kind of like the doomsday

00:23:41
scenario that every news organization was thinking about

00:23:44
with the rise of AI and like summarization of articles may be

00:23:48
upon us because people are more willing to read an AI summary of

00:23:52
copyrighted material than to then go to the article.

00:23:56
But I got super into this topic, like over the last couple days

00:23:59
and was talking to people about it.

00:24:01
Because if you talk to, you know, journalists or certainly

00:24:05
the lawyers who are representing The New York Times, they believe

00:24:07
like a grave injustice has been done here right now,

00:24:11
that the fact that, you know, these models were trained

00:24:14
on New York Times copyrighted materials, like fucked

00:24:16
everything up for everyone. And they deserve money for it.

00:24:19
And so far, most of these lawsuits have lost. Like, Meta

00:24:23
recently just won, you know, a case against this.

00:24:26
And it's a really fascinating conversation and like discussion

00:24:29
about what does it mean to consume and create material and,

00:24:35
you know, the history of copyright.

00:24:38
And, you know, like when Google created its algorithm for its

00:24:41
search engine, it was also ingesting the entirety of the

00:24:44
web, creating cached pages of the Internet and creating

00:24:48
some sort of content, if you want to call a search

00:24:50
result content, on top of that, and that had specific effects on the

00:24:54
websites. And we are seeing now basically

00:24:58
again, this doomsday scenario play out where it is competitive

00:25:01
to a degree with the original content, whether or not they

00:25:06
were legally allowed to do it. And I mean, if we want to get

00:25:09
into a debate about it, I'm down to, but like my stance as I spend

00:25:11
more time on it is like, yeah, I do believe they are legally

00:25:14
allowed to do it. And it comes down to this

00:25:17
question. And I think Trump kind of

00:25:18
touched on it, you know? Well, yeah, I was going to say

00:25:22
it. Can.

00:25:22
Can I say it? Yeah.

00:25:23
And when he was on stage, Trump, when he

00:25:26
went off script, did a little riff about this.

00:25:29
You can't be expected to have a successful AI program when every

00:25:32
single article, book, or anything else that you've read

00:25:35
or studied you're supposed to pay for, Gee, I read a book, I'm

00:25:39
supposed to pay somebody. And you know, we, we appreciate

00:25:42
that, but you just can't do it because it's not doable.

00:25:46
And if you're going to try and do that, you're not going to

00:25:48
have a successful program. I think most of the people in

00:25:50
the room know what I mean. When a person reads a book or an

00:25:54
article, you've gained great knowledge.

00:25:57
That does not mean that you're violating copyright laws or have

00:26:01
to make deals with every content provider.

00:26:05
And that's a big thing that you're working on right now.

00:26:08
I know, but you just can't do it.

00:26:09
China's not doing it. And if you're going to be

00:26:11
beating China, and right now we're leading China very

00:26:14
substantially in AI, very, very substantially.

00:26:17
And nobody's seen the amount of work that's going to be

00:26:20
bursting upon the scene. But you have to be able to play

00:26:25
by the same set of rules. So when you have something, when

00:26:29
you read something and when it goes into this vast intelligence

00:26:35
machine, we'll call it, you cannot expect to every time,

00:26:40
every single time say, oh, let's pay this one that much.

00:26:43
Let's pay this one. It just doesn't work that way.

00:26:45
Of course you can't copy or plagiarize an article, but if

00:26:49
you read an article and learn from it, we have to allow AI to

00:26:53
use that pool of knowledge without going through the

00:26:55
complexity of contract negotiations, of which there

00:26:58
would be thousands for every time we use AI.

00:27:03
He kind of touched on the point, you know, himself in a way that

00:27:06
I felt like was honestly kind of hard to go up against.

00:27:11
How do paywalls play into this? Like is there?

00:27:15
Is there just, I don't actually know the particulars.

00:27:18
Like how much is their argument that the information was

00:27:21
behind a paywall? I think if you know this gets

00:27:24
into like the robots.txt construction of websites, which

00:27:29
basically allows the robots to scan the pages.

00:27:33
I'm pretty sure stuff that's behind a paywall is

00:27:36
still probably scannable by the robots, and so that stuff

00:27:40
does like. The New York Times had set it up

00:27:43
so that it was scannable because they wanted search results.

00:27:45
Yeah, yeah, yeah. The same way that it can get

00:27:48
scanned by, you know, the web crawler also allowed these

00:27:52
models to be able to ingest the content on these pages.
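For readers, the robots.txt mechanic being described can be sketched with Python's standard-library parser. The crawler names and URL below are illustrative examples (Googlebot and GPTBot are real crawler tokens, but this particular rule set and site are hypothetical, not details from the episode):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: the publisher welcomes a search crawler
# (Googlebot) but blocks an AI-training crawler (GPTBot).
rules = """
User-agent: Googlebot
Allow: /

User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The same page gets a different answer depending on who is asking.
print(parser.can_fetch("Googlebot", "https://example.com/article"))  # True
print(parser.can_fetch("GPTBot", "https://example.com/article"))     # False
```

Note that robots.txt is purely advisory: it expresses the publisher's wishes per crawler, but nothing technically stops a crawler from ignoring it, which is part of why the paywall question ends up in court rather than in a config file.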

00:27:54
The way someone framed it to me, which I was really interested

00:27:57
in, was like, the core of the question is really, do robots

00:28:00
have the right to read? Like, are they allowed to go to

00:28:03
these websites and read and learn things?

00:28:05
And is the thing that they created by reading it

00:28:07
competitive with it? Is it transformative or is it

00:28:09
derivative? If it's derivative, it's

00:28:11
competitive and it's in violation of copyright.

00:28:13
If it's transformative, it's something new.

00:28:16
And I mean, these are the questions that are going to be

00:28:17
playing out probably in front of the Supreme Court.

00:28:20
But I should be allowed to say who can read my stuff,

00:28:24
right? And like if I have a paywall.

00:28:26
Would you be fine with your, like, Newcomer stories not showing up

00:28:30
in a Google search because you don't want it to?

00:28:32
But why can't I draw a distinction and say I'm OK

00:28:35
with Google and I'm not OK with you? And like, you're allowed to

00:28:37
do business with whoever you want.

00:28:39
Like I'm OK with businesses where they're trying to

00:28:42
link to me and get people to come to me and not OK with

00:28:45
businesses where they're trying to replace me and summarize me.

00:28:49
Like, I don't know, it's a free market.

00:28:52
I can do business with who I want.

00:28:54
So to me, I would sort of push more on this sort of like,

00:29:00
you know, should OpenAI have paid?

00:29:01
Yeah, paid to access the stories.

00:29:07
Well, people, you know, Anthropic did it in their own

00:29:09
way. You know, they literally

00:29:11
bought physical books and scanned them and then destroyed

00:29:14
the original material, which seems like the nature of

00:29:18
the book. I mean, a book, it's like,

00:29:19
yeah, you get it. You can do whatever you want.

00:29:21
You can, you know, consume it. Physical media, you have the

00:29:25
right to consume it once you own it.

00:29:27
I don't know how it's going to play out.

00:29:28
I mean, again, like every single lawsuit has lost or been

00:29:31
thrown out thus far. I think the New York Times

00:29:34
probably has the strongest one going right now just because

00:29:38
they're the New York Times. They probably have the best

00:29:39
lawyers to advocate for it. But I imagine this is going to

00:29:43
reach the Supreme Court. It like hurts my heart.

00:29:46
It's like a hard, like, on the one hand, you know, if AI is

00:29:51
going to succeed and AI is like clearly important, our models

00:29:54
need to be able to like scroll, you know, consume all this

00:29:58
information and like they're going to be of national

00:30:01
importance. And like China is clearly doing

00:30:03
the same. So just on a pragmatic level,

00:30:05
like clearly we need them to basically consume everything and

00:30:10
higher quality information. But I'm certainly sympathetic to

00:30:14
people who created all the valuable information saying, I,

00:30:18
I created this world-historically important thing.

00:30:22
Shouldn't I get money out of it? I do.

00:30:24
I do think a lot of the, like, the value in these AI models, the

00:30:29
techniques, everybody's been able to copy the techniques, and the

00:30:31
people who came up with them are some random

00:30:34
researchers, you know. A lot of the value in the models is the

00:30:37
actual underlying knowledge that's been accumulated by, you

00:30:42
know, a bunch of different companies.

00:30:43
My worry about this though is that if it does get to a point

00:30:46
where you basically just have to compensate all the creators of

00:30:49
content in order to use their material to create a model,

00:30:54
there are companies that can pay for it.

00:30:56
They're just giant tech companies and we're leading

00:30:59
towards a world where the only people that would be able to

00:31:02
build models are ones that have the tens of billions of dollars

00:31:05
that you would maybe need to pay off the creators to do it.

00:31:08
And that's fine. We're kind of already there

00:31:10
anyway, but it destroys the possibility of, you

00:31:13
know, open source models. And I think obviously the open source

00:31:17
community is not very big right now compared to the

00:31:20
proprietary ones. But if you believe in that, and

00:31:22
I think a lot of people in the Internet world do have that as

00:31:25
their core belief that open source matters, this seems to

00:31:28
set an extremely high bar monetarily to build these sort

00:31:32
of things. And I think that's a bad

00:31:33
thing. Just to add on to that, in the

00:31:35
AI Action Plan, you know, that they were touting all day

00:31:38
yesterday, there was a provision around protecting open source

00:31:43
and a lot of founders that I spoke to said that that was a

00:31:46
really important win for little tech for them so that they could

00:31:49
continue operating. So.

00:31:50
Oh yeah, that's a key thing.

00:31:52
I mean, honestly, some of what Trump is doing is what he's

00:31:55
not doing, you know, it's like the Biden administration trying

00:31:58
to come up with rules saying, a model this size, we're

00:32:01
going to track you. And now they're being expressly

00:32:04
pro open source. You know, there was much

00:32:07
more, it felt like, in the Biden administration, the Anthropics

00:32:11
and OpenAIs of the world were trying to sidle up to the

00:32:15
administration and say, you can trust us, part of the deep state

00:32:18
liberal machine. Don't trust the sort of Wild

00:32:21
West open source models. And clearly under Trump, it's

00:32:24
like, OK, open source is sort of that capitalistic, let everybody

00:32:29
compete spirit. And so that's another part

00:32:33
of what they announced. Yeah.

00:32:35
So, Eric, earlier you mentioned Dario Amodei, CEO,

00:32:38
co-founder of Anthropic, also Cerebral Valley guest at one

00:32:43
point, all these things and more on his LinkedIn, he

00:32:47
recently sent an internal memo to the staff at Anthropic saying

00:32:51
that they are planning to or in discussions to take some money

00:32:54
from the Middle East. Where was it specifically?

00:32:57
Was it, was it Qatar? Qatar definitely is one of the

00:33:00
major sources. I forget if you called out a

00:33:02
specific country. Yeah, yeah.

00:33:04
And, you know, it seemed like he was a little sheepish about the

00:33:07
whole thing, or at least defensively explaining it,

00:33:09
saying, listen, we did not intend to take money from the

00:33:12
Middle East. He literally says, like,

00:33:13
you can't just have good people benefit from your success if you

00:33:17
want to be a big company. Yeah, yeah.

00:33:20
He's teaching us about the way the world works.

00:33:21
He basically gives us the Ned Beatty speech from Network

00:33:25
about how the world works and how you need to kind of accept,

00:33:28
you know, the primordial powers of the

00:33:30
universe. It is like capitalism sort of at

00:33:33
its essence. You know, I like capitalism, but

00:33:36
there is a certain like, oh man, like our job is to do this.

00:33:40
You look at the trade-off. You know, we're gonna, we need to

00:33:43
raise money. Money is money.

00:33:44
Like, I mean, this is just one of many

00:33:48
compromises that Dario has had to make as the CEO of Anthropic.

00:33:52
And if you remember the history of this company, he broke off

00:33:55
from Open AI because he was pissed off about taking money

00:33:59
from Microsoft and it kind of polluting this effective

00:34:01
altruist mission of the original company and the

00:34:05
nonprofit. And then he goes off and lo and

00:34:08
behold, there's a shit load of money that can be made.

00:34:10
And also it cost a lot of money to make these models.

00:34:12
And, you know, he goes off and raises as much as almost any

00:34:15
other company other than OpenAI and now realizes that, like,

00:34:19
when you need to keep raising this level of money, there's

00:34:21
really only one place you can do it, and that's the Middle East.

00:34:24
And so the guy is like pretty experienced at compromises.

00:34:27
And it's funny that he's even like, trying to, you know,

00:34:30
explain it away when it's like. I mean, I don't have any

00:34:33
objection to him taking money from the Middle East.

00:34:35
I think the sort of capitulation is interesting to interrogate.

00:34:40
To me, it's interesting right now, just like thinking about

00:34:44
Trump's executive order, right? If we are counting on companies

00:34:49
to have backbone and say, I guess we won't sell to the

00:34:51
government if we're going to have to sell out our values on, you

00:34:55
know, thinking about what results the model produces.

00:35:00
This is not a sign that the companies will keep their

00:35:02
backbone, right? They're like, well, our job is

00:35:05
to chase the almighty dollar. And yeah, the government is an

00:35:08
important contractor and this is a small capitulation about a

00:35:12
topic that is not central to the model, like, whatever,

00:35:15
the money sort of swamps the principle.

00:35:18
And so I, I do think, you know, it's the classic like you give

00:35:21
up one principle and slowly you find yourself giving them all

00:35:26
up. And I don't think this is

00:35:28
necessarily the line that Anthropic needs to hold on to.

00:35:32
And I still generally trust them as a moral decision maker.

00:35:37
But, you know, it's small capitulations on the

00:35:41
way to giving up your morality.

00:35:44
It feels like that was kind of the long arc of OpenAI, right?

00:35:47
Like, you know, as an AI that is benevolent to mankind.

00:35:51
And then, you know, there were moral conflicts between this,

00:35:53
but. OpenAI was always run by one of

00:35:56
the most power-hungry men alive, so I don't know that that's less

00:36:00
surprising. I, yes, they

00:36:03
started as a nonprofit. But you know, Sam was always

00:36:05
extremely ambitious. That's sort of the through

00:36:08
line of OpenAI. That's very fair.

00:36:10
That's a good point. But all this just to say,

00:36:13
I'm saying even the effective altruist AI company goes the way

00:36:18
of capitalist enterprise, so. Yeah.

00:36:22
And listen, like if there are going to be paying customers

00:36:24
that want to buy the ability to make biological weapons with

00:36:27
your AI, do you want to win or not like?

00:36:31
They probably pay really well. Yeah, like, and that always

00:36:35
is the thing. By the way, I've noticed that

00:36:36
there's like fears about the usage of large language

00:36:39
models to make bio weapons. And, you know, to date, Dario

00:36:43
remains against using this technology to do that.

00:36:46
But listen, we make one compromise.

00:36:49
You make a lot of compromises. Well.

00:36:50
I was going to say there's always that fear on the bio

00:36:52
weapons, but the qualms around other weapons and, you

00:36:55
know, partnering with the Defense Department, those are

00:36:57
gone. So yeah, it's just the bio

00:36:59
weapons. I'm interested in like, you

00:37:02
know, the value of a corporate morals and and strategy as to

00:37:09
you know, like, is it about recruitment?

00:37:10
Is it about winning over business or is it about just

00:37:12
sleeping at night? Well, Anthropic has always had

00:37:16
this position that on AI safety they would create a race to the

00:37:20
top. And you could argue that there's

00:37:23
some truth to this. Like the idea was we'll have

00:37:26
scruples and therefore engineers with scruples will come here and

00:37:31
that will create pressure on other companies to, to have

00:37:35
scruples themselves. And I, I do think we've seen way

00:37:38
fewer defections from Anthropic than we've seen at Open AI.

00:37:42
On the other hand, you know, I don't think, like, I don't think

00:37:46
Meta is shouting about its scruples.

00:37:48
It's shouting about the power of the American dollar.

00:37:50
And so I don't know that Anthropic's race to the top is

00:37:54
necessarily trickling out to the rest of the AI industry.

00:37:58
But I do think it's had this sort of view that having sort of

00:38:02
principles about we're really going to test these models,

00:38:04
we're going to have robust safety, would help it win

00:38:08
because it would attract, you know, these AI engineers are

00:38:11
getting paid more money than they could ever use.

00:38:13
So to some degree, they are the type of people that will decide

00:38:17
where they work based on other things besides just money.

00:38:20
And you know, having sort of principles is one of them.

00:38:24
I mean, that's the missionary versus mercenary, uh, breakdown.

00:38:27
And it was funny for Sam to be the one that kind of put that

00:38:31
into, you know, the conversation and into the discourse.

00:38:34
I mean, he's a missionary, but it feels so ambitious, you know,

00:38:38
it's like, it's like I want to be the God king of, you know,

00:38:40
AI. Well, if you're a

00:38:42
missionary surrounded by mercenaries too, like you can,

00:38:45
you know, I don't know how much longer that's going to last for

00:38:48
you if it's really ultimately a mercenary world about attracting

00:38:52
talent. And I think everyone does really

00:38:54
have a number, whatever. I don't want to go deep into

00:38:56
like the talent wars again, but. I mean, you've talked to some of

00:38:59
these people. Yeah.

00:39:00
Well, I think they are, everyone has a number.

00:39:06
Yeah, I don't understand the psychology of the person who's

00:39:08
like, I make $10 million in four years, like now I need to make

00:39:14
75 or whatever. I guess I Yeah, that that person

00:39:18
is distant from me. What?

00:39:19
What is the thing that they want?

00:39:21
Well, I think it's just the human impulse of like, I make

00:39:23
$10 million and this guy who's as talented as I am, or maybe

00:39:27
even less talented than I am, is making 75.

00:39:29
Like, fuck you. Like, why do you get a third

00:39:31
house? Like that just doesn't that

00:39:33
doesn't sit well with me. So I'm going to, like, if that

00:39:36
fucking money's out there, I would be stupid not to take it.

00:39:39
I think that's a much more powerful

00:39:40
argument. And, you know, you start looking at all the

00:39:42
different like marble countertops that you could get

00:39:45
and you're like, well, you know, it adds up, you know.

00:39:48
There's always a place to spend the money.

00:39:49
That's never the issue. All right?

00:39:52
Anyway, good luck, Dario, with that Middle East money.

00:39:55
We'll, we'll be watching for it. Will they still invest, you

00:39:58
know, like, they read this memo where they're like, yeah,

00:40:01
it's distasteful, but we'll do it.

00:40:04
Right, we're about to take money from some of the most

00:40:06
disgusting people in the world.

00:40:08
This is going to be the worst of both worlds.

00:40:10
You know, they make the

00:40:13
don't get the money because they stuck their nose up at it.

00:40:16
Yeah, well, capitalism runs both ways too.

00:40:18
Like they want to invest in these things because they want

00:40:20
to make money as well. So you're right.

00:40:22
Yeah. Like they're not just.

00:40:25
All right, that about does it

00:40:27
for us this week. Thanks so much for listening and

00:40:30
we will catch you back here next week.

00:40:33
Subscribe to newcomer.co and we'll see you next week.