AI is saving the economy. For now.
Newcomer Pod · August 08, 2025 · 00:37:08 · 34.01 MB


GPT-5 has landed! Is it the leap forward OpenAI promised or just an incremental upgrade? Eric Newcomer and Tom Dotan discuss this, how AI capex might be propping up the entire economy, and what Apple’s golden gifts to Trump say about Big Tech’s political bets.


00:00:00
I'll be honest, Eric, I'm having, I'm having an annoying day.

00:00:04
I have been trying to reach sources in the AI world because

00:00:06
I really need to get my story out and no one wants to talk

00:00:09
because everyone is paying attention to the GPT-5 release.

00:00:12
Well, it's like a bank holiday in tech, you know, it's just

00:00:15
like, everybody's... And they complain.

00:00:17
I feel like the competitors all complain that, you know, OpenAI's

00:00:20
people all tweet in advance and

00:00:23
they're like, we've got something coming, you know, they,

00:00:25
they're good at building up the sort of excitement on Twitter.

00:00:29
And then, you know, everybody clearly is excited.

00:00:32
It feels like all the other AI companies are orbiting, you

00:00:35
know, OpenAI. They gotta be excited

00:00:37
about something, 'cause remember last year Sam had the, the 12

00:00:41
days of shipmas where they were putting out a new product

00:00:44
every day, which I think culminated in, I don't even remember

00:00:49
which model, but the fact that I don't remember

00:00:51
clearly means that it didn't build towards something huge.

00:00:54
Whereas, you know, this one is supposed to be really

00:00:57
meaningful. Like this is supposed to be the

00:00:59
moment where GPT-5 and OpenAI reasserts itself as the best

00:01:04
model that's out there. And specifically on coding, they

00:01:07
in their, you know, embargoed press conference and then later

00:01:10
in, you know, the, the, the, the public press conference today

00:01:13
basically said, like, we have the best model out there for coding.

00:01:16
And on coding, everybody would like

00:01:20
them to get better, right? People are spending a ton of

00:01:22
money on it. Anthropic is dominating and

00:01:25
competition is good for consumers.

00:01:27
So we'd like them to succeed. Not clear yet if they have.

00:01:32
I mean, the prediction markets, you know, I think there's a

00:01:35
betting pool on who will have the best model at the end of the

00:01:38
month. And Gemini is leading and I

00:01:43
think gained a lot of ground when GPT-5 came out.

00:01:47
So the initial reaction I think is not so positive.

00:01:52
This podcast is supported by Google.

00:01:54
Hi folks, Paige Bailey here from the Google DeepMind DevRel team.

00:01:57
For our developers out there, we know there's a constant trade

00:02:00
off between model intelligence, speed and cost.

00:02:03
Gemini 2.5 Flash aims right at that challenge.

00:02:06
It's got the speed you expect from Flash, but with upgraded

00:02:09
reasoning power. And crucially, we've added

00:02:11
controls like setting thinking budgets so you can decide how

00:02:14
much reasoning to apply, optimizing for latency and

00:02:16
costs. So try out Gemini 2.5 at

00:02:19
aistudio.google.com and let us know what you build.

00:02:23
And I feel like a real loser. I mean I keep opening ChatGPT

00:02:27
waiting for Christmas to arrive and mine still says 4o and I'm

00:02:32
paying you guys. I've got a plus account.

00:02:35
I asked ChatGPT, like, what are you? What do I do?

00:02:39
And it's like trying to do like logic puzzles with itself to see

00:02:42
if it thinks it's actually GPT-5 or not.

00:02:45
It's like, well, if I was 4, I wouldn't be, I wouldn't be able

00:02:48
to solve this problem if I was 4o, so I must be 5.

00:02:53
You know, it's like, why can't they just program it, doesn't

00:02:56
it? I thought they'd give these

00:02:57
things prompts where they say like.

00:02:59
Doesn't your prompt tell you what model you are?

00:03:02
Oh, I I have. Yeah.

00:03:03
I guess it should be able to say that that's a fairly basic

00:03:06
feature that's just like the about page on your iPhone.

00:03:09
Like, yeah, I forgot what kind of iPhone model I have.

00:03:12
But you've actually plunged this thing into like a deep

00:03:14
existential crisis. It's like, have you been

00:03:17
outmoded yet? You know, didn't they have

00:03:19
like a funeral recently for Claude, for one of the early.

00:03:22
I didn't see that. Wait, what?

00:03:23
Yeah, yeah. Kylie Robinson, our buddy over

00:03:26
at Wired Now, I think she had a piece about this talking.

00:03:30
Has she been on the show? Yeah, we need to have her on

00:03:31
the show. No, we should have her.

00:03:32
She lives down the street from me too.

00:03:33
We should have her have her in studio.

00:03:35
But yeah, she she did this story about how when they retired one

00:03:38
of the early versions of the Claude model, they had like a

00:03:41
funeral for it, which is a particular type of AI

00:03:45
researcher. Very anthropic.

00:03:48
Yeah. Oh yeah, for sure.

00:03:50
Anthropic, by the way, is definitely feeling the heat like

00:03:53
they rush-released, it seemed like, their updated

00:03:55
version of Claude, like Claude Sonnet 4.1 or something, this

00:03:59
week. So it's been, and then also OpenAI

00:04:01
had the open source model that came out, which

00:04:05
everyone, for whatever reason (not everyone, you know,

00:04:08
whatever Twitter's algorithm wanted to show me) declared

00:04:10
as like a kind of a bust. It truly

00:04:14
wasn't met with like the DeepSeek-like ferocity.

00:04:18
Well, DeepSeek was about other people being able to play catch

00:04:21
up to OpenAI. So OpenAI, you know, there's

00:04:26
such high expectations. I mean, the thing that open AI

00:04:28
has that everybody else doesn't have is users.

00:04:31
You know, just like there's the inertia that consumers will use

00:04:34
whatever we have. Like we can say, oh, the 4o

00:04:36
announcement was underwhelming, but it's like, I'm going to

00:04:39
ChatGPT and just waiting for the thing to update.

00:04:42
But there's like a more powerful model, which I think

00:04:45
there's no debate as to whether or not 5 is better in most

00:04:49
regards compared to like the earlier OpenAI models.

00:04:51
Like is that going to drive more growth?

00:04:53
Like who are these model updates really for?

00:04:55
I, I get the like coding side of it.

00:04:57
Like that's a very competitive field and Anthropic basically

00:05:00
owns it now. So like open AI has business

00:05:02
reasons to be a real player there.

00:05:04
But you know what, ChatGPT is at like 700 million weekly

00:05:09
users now. Like is the thing that takes it

00:05:12
to a billion like 5 being that much more powerful or it's just

00:05:16
like generally like there's still more word that needs

00:05:19
to leak out to the non-initiated about like ChatGPT as

00:05:24
a software rather than the fact that like 5 can do whatever

00:05:28
logic puzzles slightly better. First I was like, this is an

00:05:31
insane question because it's like, obviously you know, for

00:05:37
consumers, they have to go find how the model can deliver value.

00:05:41
And the smarter it is, the more value they'll get out of it and

00:05:43
the more reasons they have to use it. If they're using it a lot,

00:05:46
OpenAI will find more ways to charge for it.

00:05:47
So obviously, you know, a better model means a lot of stuff.

00:05:51
And then, you know, real businesses that will build on

00:05:54
top of the API will find use cases and be able to charge for

00:05:58
it and hand hold users to the value.

00:06:02
But I, I do agree with a piece of the question, which is

00:06:05
just like sort of astonishment that these models can be as

00:06:10
impressive as they are today. And there's still lots of people

00:06:14
who don't use them at all. And then some of us that do use

00:06:16
the models, we're not really trying to apply them in all the

00:06:20
cases we should. But I think that's just

00:06:22
because it's like, you know, businesses

00:06:25
are going to need to build on top of these models to hand hold

00:06:29
us to exactly the value. And so getting smarter is great

00:06:33
for the API business and that's where I think a lot of the

00:06:36
business could be long term. Yeah, but that implies that

00:06:39
people were using earlier versions of the models and were

00:06:41
underwhelmed by it because it was technically not as powerful

00:06:45
as they wanted, when really it could just be like they haven't

00:06:47
been exposed to it. I'm talking about normies,

00:06:49
right? Just people that would be using

00:06:51
ChatGPT. I.

00:06:52
I just think, you know, some hallucinations and you don't

00:06:54
trust somebody. You know, somebody lies to you a couple times,

00:06:56
you're like, I don't trust you. You know, I, I think sort of

00:07:00
consistency is really valuable in building trust.

00:07:03
And so if these models can, can become just more reliable, and I

00:07:07
think that's part of what OpenAI is saying here.

00:07:09
It hallucinates less, and that could help it a lot.

00:07:13
I mean, it's worth saying. I mean, while we want it to get

00:07:16
much more intelligent, part of the big development here is just

00:07:20
that this is a unified model that OpenAI is going to route

00:07:23
you to: oh, this is a pretty easy question.

00:07:26
We don't need the model to waste all its resources thinking about

00:07:29
it. And oh, this is an intense

00:07:31
question. We'll think about it.

00:07:32
And so that could allow the model to operate more

00:07:35
efficiently and also just make life easier for consumers to

00:07:39
get great answers without having to pick the appropriate model.

00:07:43
Yeah, I, I think it's both like a model selector, but also my

00:07:47
understanding is that the models themselves have been improved.

00:07:50
So it's not just like we've routed to the most useful thing

00:07:53
to answer your question, but you know, the tool, the tool

00:07:57
itself is more powerful. Yeah, I, I don't know.

00:08:00
I mean, I don't know. You use this more than I do.

00:08:02
I don't yet pay for the, the Plus version of it. $20 a

00:08:06
month is just, I'm not there yet. I'm not, sorry,

00:08:09
I'm not even against it. I just like I haven't found $20

00:08:12
worth of value for me that is like that much better than the

00:08:14
free version. I just like proofread it,

00:08:16
get feedback, run my stories through it.

00:08:18
What do you think? What am I missing?

00:08:20
Like it always gives me a couple useful thoughts.

00:08:23
Like, all right, it's a new model.

00:08:25
I mean, I think the last word on it to me is just we always try

00:08:28
to like decide whether it's a good model on the first day.

00:08:31
And I think nobody knows right away.

00:08:33
It takes some time. We write off models and then

00:08:37
they become sort of our daily drivers.

00:08:40
And then other times we get excited and then nothing.

00:08:43
So I think we'll see people play around with them.

00:08:47
Yeah. I also think we can't overstate

00:08:48
how significant it is for OpenAI in terms of, you know, it

00:08:53
needing to be some kind of a leap for them because a lot of

00:08:56
the conversation has been, you know, GPT, everything has been

00:08:59
more or less iterations on 4, which came out like a little bit

00:09:03
over two years ago. So this is like, there's

00:09:05
no way around this one. This wasn't.

00:09:07
And you know that the back story of building this, you know, this

00:09:09
model was that for a long time they were working on this giant

00:09:13
model. Like I don't even know how many

00:09:14
trillion parameter model that was.

00:09:16
You know, they called it Orion internally and they realized

00:09:19
most of the way through it kept being delayed and delayed and

00:09:22
delayed that it wasn't showing large enough improvements in

00:09:25
capability for it to merit being called 5.

00:09:29
And the same time, you know, this breakthrough came with like

00:09:32
deep reasoning and, and like logic, you know, the ability for

00:09:36
it to like handle more complex chains of thought.

00:09:39
And so they ended up calling Orion GPT-4.5 and it's basically

00:09:43
unused as a model. No one really talks about it,

00:09:45
but 5 is the real deal. Like it has to be something.

00:09:48
It can't just be written off as like, yeah, we're still

00:09:50
iterating on something. And this does not feel, it does

00:09:53
not feel like, what was the iPhone 4, the one

00:09:58
where it's like, oh, new look, or it's like they have the App

00:10:01
Store. And, well, the X, right?

00:10:03
The iPhone X, the first time they had like the full

00:10:05
screen. It's not like that, it's not one of the big iPhone

00:10:08
moments where it's like, oh, they really changed it.

00:10:10
This is like, oh yeah, they come out with new phones every

00:10:12
once in a while and sometimes we try to get ourselves excited

00:10:16
about it and that's how this feels to me and I agree yeah,

00:10:20
we've been talking about 5 for a long time.

00:10:22
Thought 5 would be an insane moment.

00:10:24
And so far I'm not seeing that level of like hysterical

00:10:28
euphoria that I, I would have expected, you know, six months

00:10:32
ago for ChatGPT 5. Yeah, which again, like back to

00:10:35
what you said earlier is maybe why people are expecting Gemini

00:10:38
to leap ahead because OpenAI's best shot just didn't blow

00:10:42
people away as much as, you know, maybe unrealistic

00:10:44
expectations led them to think. Like they're in a big fight in,

00:10:48
in terms of model quality. Now, like you, you've got

00:10:50
Anthropic very capably putting out stuff, you know, Meta, I'm

00:10:53
assuming, will come out with something of value.

00:10:57
And, you know, Google is very serious.

00:10:59
Microsoft still irrelevant, by the way.

00:11:01
Like I'm never gonna not say that until they do something real

00:11:03
here. Yeah.

00:11:04
Like it's just a very different landscape than when 4 came out.

00:11:07
Let's talk about a related but different topic.

00:11:11
So the CapEx build-out, data center build-out, we're in

00:11:17
this moment. And I keep telling this to other

00:11:19
people, AI-adjacent. They're like, man, AI is

00:11:21
apparently like propping up the entire economy.

00:11:24
It's like, thank God, you know, we're in the orbit of artificial

00:11:28
intelligence, because that's the thing saving the

00:11:31
economy. And you know what they have to

00:11:33
say, like, at The Wall Street Journal when I

00:11:39
was covering AI. So now you're working for, you know, a

00:11:42
company with an AI conference. So we got even closer to

00:11:42
it. They, they.

00:11:43
Yeah, that was the problem. I wasn't close enough to AI.

00:11:46
Exactly. But the thing is, you know,

00:11:48
it's the infrastructure build-out, that there's real capital

00:11:51
expenditures going into this and that the big tech giants, you

00:11:55
know, are spending aggressively. Yeah, basically what I saw

00:11:57
happen. And it's kind of funny when like

00:11:59
an idea just percolates through, you know, the discourse.

00:12:02
But you had at least three different articles come out in

00:12:05
the last two weeks pointing it out. We were earlier than some of

00:12:09
this. Like, sorry. Is this what Kedrosky or

00:12:13
whatever, he sort of initiated it?

00:12:16
Is it Paul or Peter? Yeah, that's Paul Kedrosky. I saw,

00:12:20
so we riffed on that like a couple

00:12:23
weeks ago, and now, yeah, The Journal, Noah Smith, a bunch of

00:12:27
people are like, oh. Maybe everybody loves a good "the

00:12:31
world could end in financial collapse" thesis, and that's sort

00:12:35
of what this is, right?

00:12:37
And essentially what I think Kedrosky's article did

00:12:41
a very good job explaining was, like, you know, we had some

00:12:43
new GDP numbers coming out. It's also earnings season so

00:12:46
people can more easily calculate what, like, the CapEx spending

00:12:49
looks like for all the biggest companies. But a non-

00:12:56
insignificant percentage of it, what was it, like almost 1% of

00:12:56
GDP growth came from AI CapEx spending.

00:13:00
And that is, you know, like the chart that that Kedrosky put

00:13:04
together showed that, you know, outside of the building of the

00:13:08
railroads and like the telecom build-out during the dot-com bubble,

00:13:13
you know, have you seen that much of GDP growth confined to

00:13:17
one particular sector? And it's a, you know, an

00:13:19
infrastructure build out that at least in the case of AI is like

00:13:22
well ahead of its, you know, its broad-based

00:13:27
adoption. And I guess, you know, the, the

00:13:30
next question to all of this is like, well, is this a bubble?

00:13:33
Is this something that could potentially bring down the

00:13:36
economy if AI doesn't end up becoming what a lot of people

00:13:40
are expecting? And I, I, you know, I want to

00:13:43
take the bear case on this one because it's more interesting to

00:13:46
me and, and say, like, oh yeah, we're absolutely circling,

00:13:50
you know, a Bear Stearns-like event where, you know, AI

00:13:53
doesn't, you know, doesn't pan out.

00:13:56
But I just don't see it. Like I can't follow the train of

00:13:59
thought that leads me to: AI,

00:14:02
you know, CapEx spending is going to somehow, when AI doesn't

00:14:06
get as widely adopted, result in like the collapse of, you know.

00:14:10
Like 10% unemployment.

00:14:12
This is like Meta going in on the metaverse, like they spent

00:14:16
tons of money trying to do that and like, well, didn't work.

00:14:19
The key here is that the Mag 7, they make tons of money and so

00:14:24
they can spend a bunch on CapEx for everybody.

00:14:27
But Meta, it's still, I think less than their actual sort of

00:14:31
cloud revenue, right? And even if it were bigger, it's

00:14:33
just like they stopped doing it one day and they say, OK, well,

00:14:37
we'll print money again. Like I, I don't see the debt

00:14:40
story. I know some of these things are

00:14:42
like financed, but they're super valuable companies that could

00:14:45
raise unlimited sums of money that produce real cash.

00:14:49
Like, yeah, they're monopolies. What do you, like,

00:14:52
If you're a monopoly, what are you going to do with that,

00:14:54
right? If you have just an incredible

00:14:56
margin, you have cash flow like crazy cash flow.

00:14:59
The money's got to go somewhere. You could do stock buybacks.

00:15:02
You could. And this is a good thing.

00:15:05
Like, wouldn't we? I feel like the sort of left

00:15:08
is like, oh, you know, share buybacks, like exactly what

00:15:11
you're saying. Share buybacks and dividends,

00:15:12
they're bad, they're not doing anything.

00:15:14
It's like here they are. They're excited about something

00:15:17
in the future and they're investing in the real world

00:15:20
infrastructure to run at it. I mean, I think if you believe

00:15:23
AI will be good for the world, then it's like, great, yeah.

00:15:26
So what if they're spending a little bit ahead of, you know,

00:15:30
themselves and they find out they sort of overbuilt too

00:15:33
soon? Yeah.

00:15:34
I guess like there's an argument that like the actual technology

00:15:37
that you're building is something that's going to be

00:15:38
like job displacing. So like there's a medium to long

00:15:42
term impact of building up the technology there.

00:15:44
But like, I mean, that gets into like the, you know, kind of

00:15:46
technophobic argument of like, well, what do you want to do?

00:15:48
Not invest in this technology that's going to be

00:15:51
transformative? Like, and again, I'd love to take the

00:15:54
bear case here, but I just can't follow that logic of like,

00:15:57
let's not do the thing that could potentially create like

00:15:59
God. And the people making those

00:16:02
arguments are always on their iPhones, you know, they're just

00:16:05
like, I mean, what?

00:16:08
It's the Chapos of the world who are like anti-tech, but then

00:16:12
they're clearly like tech-addicted.

00:16:14
You know, it's like they never choose not to use the products

00:16:18
that they say. Oh, we shouldn't build the next one,

00:16:21
even though I love the current one.

00:16:23
Yeah, it never makes sense. Yeah, I don't

00:16:25
want to go too... Let's, I would want to save that part of the

00:16:27
conversation for a different episode because I do think there

00:16:29
is something to say about like the social impact of AI and the

00:16:32
effect that it's having on like intelligence and like

00:16:35
sociability and like. You know, trying to think about

00:16:38
stuff like, I think like people not reading, I mean, we talked

00:16:41
about this on the Rupal Valley podcast.

00:16:44
I made a defense of reading and writing.

00:16:47
I, I, I do worry about that. But to me that's more of a

00:16:51
social media phenomenon. AI as it exists right now in a

00:16:55
chat bot actually makes you like sort of think and write and

00:16:59
engage. I think it's much more

00:17:01
intelligent than sort of the leaner force.

00:17:02
You've forced me into this argument now.

00:17:04
Yeah, Yeah, I think, I think the intent aspect of, you know,

00:17:08
ChatGPT is much better than the passive engagement of social

00:17:12
media. There's no question to me that

00:17:14
that is an improvement. I think the idea of...

00:17:15
Social media is built on engagement, whereas at least for

00:17:19
now, chat bots are predicated on getting you truthful answers and

00:17:23
appearing smart on tests. So obviously they're much more

00:17:27
virtuous than feeds. I think outsourcing writing and

00:17:32
all thought to an AI is like problematic for a society.

00:17:36
Like rather than cogitating over an issue, yeah, yeah.

00:17:39
Or like, I need to fire these employees, how can you please

00:17:41
write this for me so I don't have to deal with, like, the

00:17:44
emotional labor of what it means to communicate to someone is

00:17:47
like a net negative on society. Like I think there's all kinds

00:17:50
of bad things that like asking someone to imitate human emotion

00:17:53
and logic and having that be like, then put in your own words

00:17:58
like has on a society. But I don't think that's what

00:18:01
Kedrosky was getting at in his... No, we're pretty far from it. You

00:18:06
forced me into it. ...piece of it.

00:18:09
Yeah. Look, I think there's also like,

00:18:12
yeah, for the Mag 7. I don't remember exactly

00:18:15
who's in the Mag 7, but for all these companies.

00:18:19
Alphabet. Yeah, these are like cash

00:18:21
machines, like they have no issue investing in

00:18:25
infrastructure. They're not taking on huge

00:18:27
amounts of debt. There are companies in the AI

00:18:29
mix that I'm a little bit more concerned about.

00:18:32
Oracle, which I pay attention to, is taking on debt so they

00:18:35
can finance the Stargate shit that they're doing with OpenAI.

00:18:38
Isn't Oracle just, like, professionally rich and good at

00:18:42
selling tech?

00:18:44
as much as you, but they're doing the financing

00:18:48
because they have money, right.

00:18:50
Yeah. And like so far no one's like

00:18:52
freaking out about Oracle because they're taking on debt

00:18:54
to like, you know, buy the GPUs that are going to be in the

00:18:57
Stargate data centers. But they're just in a different

00:18:59
class of company than the Mag 7 because they're not in the Mag

00:19:03
7. And you know, there's also ones

00:19:05
like CoreWeave, which you know, is a crazily valued stock

00:19:09
right now, and they keep going back to the

00:19:12
debt markets to do build-out. So there is

00:19:15
some concerning leverage I think for some of these companies that

00:19:19
that are taking on debt. But it doesn't seem like a load-bearing

00:19:23
beam, that if those things go under you're going to see a

00:19:26
2008-like event happen. Or even, I'm

00:19:29
definitely open to a world where, here's the bull case.

00:19:33
Sorry, the bear case to me, which I'm not necessarily

00:19:37
subscribed to, is all the tech stocks are massively inflated

00:19:44
because they're investing in the technology that's driving some

00:19:50
of their revenue and driving their multiples.

00:19:53
Because basically it's like, we're gonna, you know, big tech

00:19:57
companies are the biggest investors in AI startups.

00:20:01
Those startups are turning around spending on cloud

00:20:04
computing and foundation models and stuff.

00:20:06
Where they're generating revenue, so they're

00:20:08
spending money to make money, sort of the old, you know,

00:20:11
revenue round-tripping issue. And then the opportunity of AI

00:20:16
is also increasing their multiples.

00:20:18
So I could see a world where all right, you know, suddenly AI is

00:20:22
not delivering as much as people thought or AI is delivering, but

00:20:26
there's no margin to be had because open source is so

00:20:29
successful. And then all the stocks crash

00:20:32
and they crash because, you know, they're, they're sort of

00:20:36
benefiting in multiple ways, both revenue and multiples.

00:20:39
And so that would hurt them. And I think given that they're

00:20:42
so important to the performance of the S&P 500, it would just be

00:20:46
like, bleak. You know, in the American economy all of a

00:20:49
sudden everybody would feel much poorer because the value of like

00:20:52
the stock market has gone down so much.

00:20:54
But it doesn't seem, it seems like, oh, we're all just like a

00:20:56
lot poorer. I, I don't see this sort of like

00:20:58
ripple effect. They're companies that make a

00:21:00
lot of money. They make way less, they're way

00:21:02
less valuable. And then suddenly America just

00:21:03
feels weaker relative to the rest of the global economy.

00:21:08
But I don't necessarily see like who's going bankrupt.

00:21:13
Yeah. And like, let's just also

00:21:15
remember that, like these companies all crushed earnings

00:21:19
these last two weeks. Like they're foundational.

00:21:21
Bullish. And they're AI.

00:21:23
Yeah. It's like it's clearly this is

00:21:24
where things are going. We spend all our time on the

00:21:26
Internet like these companies are the main entities innovating

00:21:29
in the world. Like they're very sophisticated

00:21:33
businesses. Yeah, I don't, I don't know.

00:21:35
I'm pretty bullish. Yeah.

00:21:37
I guess the part of it that I think is worth being concerned

00:21:42
about or paying attention to is the fact that it is a non-

00:21:47
marginal part of GDP growth, like spending on these things, and

00:21:50
like you have to just pay attention to that.

00:21:52
Like whenever one sector is that responsible. Like, absent all

00:21:55
of these things, we would kind of have fairly anemic

00:21:58
growth. And you know, the argument also

00:22:02
is like, well, that's fine because this is transformative.

00:22:04
Like we should be investing in these things.

00:22:06
And like this is all going to like later lead toward even more

00:22:08
growth because we're going to have all this efficiency and and

00:22:11
then people will be more productive.

00:22:13
But it's like it's getting into the territory where it's

00:22:16
interwoven into the broader economy.

00:22:18
It's not just like some tech experiment of, you know, some

00:22:22
companies building something that could be cool like this is

00:22:24
now part of it. Like you're, you know, you're

00:22:28
woven into the fabric of what's making the economy work and

00:22:32
problems within that, I mean, we might not even

00:22:34
foresee what the problem could be, but problems within that

00:22:36
will have ripple effects. And that's it's just worth

00:22:38
paying attention to. Like I don't see the warning

00:22:40
signs yet. But like it's, you know, it's

00:22:43
built into it. Well, it's just also just it's

00:22:46
been too long, you know, in between crashes, right?

00:22:49
Like, you know, in startups, there was a

00:22:52
downturn. You know, 2021 was a peak.

00:22:55
We came down. There were all these software

00:22:57
companies that were overvalued. We did have a minor correction

00:23:01
because of the pandemic inflating the value of tech

00:23:04
stocks and driving forward revenue that then slowed down.

00:23:08
But yeah, AI came around at just the perfect time.

00:23:12
And like, there's no, like there's no morality play there.

00:23:15
It's like, oh, we should have, we should have really taken our

00:23:17
comeuppance more. It's like, no, we got lucky that

00:23:20
there was like another really cool idea just as tech was about

00:23:24
to take it pretty hard from getting overexcited about

00:23:27
everybody working remotely. And so that sort of saved

00:23:30
things. And that is the tech idea, that

00:23:32
it's like, oh, we'll keep coming up with new ideas and the

00:23:34
new ideas are real things and that will create value.

00:23:38
And it's not just like financial games like on Wall Street.

00:23:42
Right. And it's also just kind of the

00:23:43
way things work. Like I I see this argument being

00:23:46
applied to San Francisco and be like man if it wasn't for the AI

00:23:49
boom, San Francisco would really be fucked.

00:23:51
Right, that's just, they're all hard at work, like,

00:23:53
coming up with a new thing. It's

00:23:55
kind of what they do here. There's always something, man.

00:23:58
It's like, man, if it wasn't for the Gold Rush, San Francisco

00:24:00
would really be... Right, right.

00:24:02
Yeah, it's, it's like that's not really the argument you think it

00:24:05
is, because that's just like that's how a boom economy tends

00:24:08
to work. And the fact that, like, San

00:24:10
Francisco keeps coming up with the new thing that matters to

00:24:13
the economy is why, like, it's not Detroit. What they're good at is, they all left

00:24:17
crypto because crypto stopped making them money and they shifted

00:24:20
over to AI. And honestly, you want things to

00:24:22
die so that people can go to the actual productive parts of the

00:24:26
economy and, you know, yeah. Well, you're closer to the

00:24:30
crypto people than I am. But like, like, are they all

00:24:33
just like riding a false high right now?

00:24:35
Like knowing that the value of all of their holdings is like

00:24:38
not really logical at all? That, like, what?

00:24:40
Do you mean Bitcoin? I mean the dollar is eroding at

00:24:43
the moment. So I mean Bitcoin is like there

00:24:45
is, there is, I mean, I, I don't hold a lot of Bitcoin.

00:24:48
I, you know, I have what, $1000 worth or something, but like

00:24:53
definitely, I, I think, you know, Bitcoin is a psychological

00:24:57
investment thing, just like, you know, so many things and it's a

00:25:01
bet against the dollar. And I, I think the dollar is in

00:25:05
a bad place. So I don't, I don't know.

00:25:07
I, I think the other thing in crypto, which this is, you know,

00:25:10
it's just that it's like fraud is legal and there's tons of

00:25:14
fraud in crypto. And so it's like you can do

00:25:16
whatever you want seemingly. And so I think that's going to

00:25:19
open up some activity. Right.

00:25:21
Well, I'm just thinking about it because, you know, I, I was

00:25:25
reading our buddy Mike Isaac's story about San Francisco and

00:25:29
like, you know, this moment, the, you know, the AI moment and

00:25:32
how we've moved to the hard tech era in, in Silicon Valley.

00:25:36
And you know, he had this like one

00:25:37
clause in there of like San Francisco is used to like things

00:25:40
like blowing up and like, he's like, like crypto and he's like

00:25:43
also things tanking, like crypto, which is like a funny structure,

00:25:48
but also like, like crypto is not tanking right now.

00:25:51
It's just in terms of like the value of the currencies, it's

00:25:53
actually really high. So that wasn't the best analogy.

00:25:57
Did you read that piece by the way?

00:25:59
Yeah, I, I read some of it. I mean, I, I don't get, you

00:26:03
know, it's just the, the trend story sort of thing.

00:26:07
You know, it's like this is the New York Times, you know, it's

00:26:10
like this, this is where we are. I mean, I, I think it's

00:26:13
generally correct. You know, it's just like, I

00:26:15
don't know, it's not, it's too high level.

00:26:19
I guess, Yeah. I mean, first of all, you know,

00:26:21
Mike's a buddy and I always, you know, want to support my friends

00:26:24
work and, you know, the New York Times, you know, loves calling a

00:26:27
moment and explaining to the normies what's going on here.

00:26:31
You know, there's always like kind of a combination of like

00:26:34
things that have been going on for like years that we just call

00:26:36
is like, oh, this is just happening now.

00:26:37
It's like the year of efficiency I think was like 2022.

00:26:41
So it's hard to say that's like specific to this moment.

00:26:44
A lot of things about tech are just

00:26:47
things humans do and a lot of things people hate about the

00:26:49
media are just things humans do. And a thing humans do is say

00:26:53
what is what is the mood of the moment?

00:26:55
And, you know, there were tech busts.

00:26:56
The, you know, we have these narratives in The New York

00:26:58
Times. And I think Mike's story does a

00:27:00
good job of trying to sort of coalesce, like where's our

00:27:04
headspace at? And obviously these things are

00:27:05
all, you know. Yeah.

00:27:07
So. But.

00:27:07
There, but there is something different.

00:27:09
I was talking to our colleague Jonathan Weber about

00:27:12
this. Like I think there is something

00:27:14
worth examining about, you know, what does the AI moment, how

00:27:18
does it compare to previous booms and busts and waves of

00:27:22
technology? And like, there's no question.

00:27:24
And like his article starts off with his interesting analogy of

00:27:27
Silicon Valley and like the rest and vest era and how you know

00:27:33
there was a time. It starts

00:27:34
off. I know we love him, but he starts off in fiction.

00:27:38
He talks about Silicon Valley, the show, and he's like, oh, but

00:27:41
it was really truthy. It's like, oh man, of all

00:27:43
places, I wish the New York Times wasn't sort of doing that.

00:27:45
That's like, well, I, I have to, I have to do a disclosure on

00:27:49
anything related to that show because I was briefly an advisor

00:27:53
to it and my brother was an executive producer on it, so.

00:27:55
I was once in a storyboard meeting with that, yeah.

00:27:58
Associate producer. I had multiple meetings and a

00:28:00
cameo. Your brother was like legit on

00:28:02
the show. Yeah, yeah, yeah, yeah.

00:28:04
No, but there was a lot of research that went into it.

00:28:06
So it's very accurate. So I know, but, but I

00:28:09
will say him using, you know, Silicon Valley and the rest

00:28:11
invest moment. This is about like the pampered

00:28:13
tech, you know, employee. This was like the Googleplex.

00:28:16
This was like whatever Odwalla juices and whatever dumb shit

00:28:19
people associated with the time. And like now things are austere.

00:28:22
We have to go back into the office, which is really, you

00:28:24
know, unfair to make people do. And you know, what we're

00:28:28
building now is maybe more serious. Era of the $100 million

00:28:33
Facebook offer. Yeah, but that's interesting

00:28:36
too, right? Like, yeah, is this like an era

00:28:38
of austerity and efficiency because you're making people go

00:28:40
to the office? But also apparently everyone at

00:28:42
open AI got like a $1.5 million retention package,

00:28:46
right? Now instead of spending the

00:28:48
money on perks, which I think some of the perks were like tax

00:28:51
efficient, you know, it's you get, you can deduct food in the

00:28:54
office. So, but it projects softness. Now

00:28:56
it's just like, fine, give them cash and say work hard.

00:29:00
Yeah, yeah, hundreds of millions of dollars.

00:29:02
That is a fairly select group of people.

00:29:04
I don't know if it's going to have the ripple effect, but.

00:29:07
The last thing we wanted to talk about was, you know, Tim Apple,

00:29:11
Tim Cook being right, right? He's like in the White House

00:29:14
between Trump and JD Vance, right?

00:29:17
Like, just as they're saying terrible stuff, he's giving

00:29:20
Trump 24 karat gold. I don't know enough.

00:29:23
It seems like an insane amount of gold, right?

00:29:26
It's never enough. It's never enough.

00:29:27
It's. Like he's a king.

00:29:28
You know, it's just like the, I mean, the, the thought I had is

00:29:31
just like, did any of these tech executives really believe

00:29:36
diversity, equity and inclusion? You know, it's just like, OK,

00:29:39
they did the Democratic thing. Now they're doing the like Trump

00:29:42
thing. They're willing to like kiss the

00:29:44
ring for whatever the thing is of the moment.

00:29:46
I don't know, it's it's just insane to me that they'll just

00:29:49
like, here are our gifts, our king.

00:29:52
I don't know, what was your reaction?

00:29:53
I mean, Apple, like, they are the most vulnerable maybe of all

00:29:58
of the big tech companies to... Because of China.

00:30:00
To Trump's caprice, like if he's going to start fucking around

00:30:03
with tariffs in a more serious way, they have major problems.

00:30:06
Like you can only like rush production in India one time to

00:30:09
get, you know, out before the tariffs start coming down on

00:30:12
you. So like it wasn't enough gold,

00:30:14
frankly, for whatever. I didn't even watch the press

00:30:17
release, or the press conference. I missed it.

00:30:19
Well, Trump was like singing his praises, like Tim Cook's praises.

00:30:23
I mean, he was like. He likes rich people too.

00:30:26
I mean, he's still, even after everything that happened with

00:30:28
Elon, still being like, Nah, he's a good guy.

00:30:31
He had a bad day, but he's still a good guy.

00:30:33
So he's like very specific on like who he wants to demonize.

00:30:37
The smartest people are the richest people.

00:30:38
It's like whatever you did to figure out how to get all that

00:30:40
money, you're smart now. Yeah, they're richer than him.

00:30:43
I mean, I don't know, Tim Cook is well, who knows how much he's

00:30:45
worth, but but no, obviously it's it's it's disgusting and

00:30:48
like, and obvious, and like fairly ham-fisted.

00:30:53
And I love that he's got it down now that every president.

00:30:56
I think it started with Trump and then even with Biden.

00:30:59
And now with Trump again, they're like, now we're doing

00:31:02
another like 600 billion, you know, Oh, why 600?

00:31:05
Now we're going to up it to eight.

00:31:05
You know, they just like they just keep amping up.

00:31:08
They're like, we're going to invest in America.

00:31:10
You know, they, it's just like, I mean, they just make up like

00:31:14
these press releases basically, like, I mean the

00:31:17
numbers are made-up. The numbers are absolutely

00:31:18
made-up. Cause 600 billion is meaningless

00:31:20
for Apple, right? And this is the thing I hated about

00:31:21
the media. I mean, which again is just a

00:31:23
humanity thing. But it's like we, we talk about

00:31:28
this is a tangent, but it, we're, we're deep into the

00:31:30
episode now. But, you know, it's like Open AI

00:31:32
gets stories about how they raised $40 billion or whatever

00:31:37
from SoftBank and then that seemingly hasn't totally

00:31:41
happened yet. And then they get excited

00:31:43
headlines again, I think from the New York Times saying that

00:31:46
they raised less than that as if it's like upbeat.

00:31:50
It's just like these these companies are able to get

00:31:53
excited headlines about big numbers over and over again in a

00:31:57
way that nobody's really adding up all the numbers and

00:32:00
saying, did they? Did they get what they were

00:32:01
supposed to get last time? Yeah, well, open AI is

00:32:04
particularly egregious on that front because, yeah, the

00:32:07
SoftBank stuff, it's really unclear where, you know, if that

00:32:09
money is even going to come through because it's, you know,

00:32:12
contingent on the conversion. By the way, all these people

00:32:14
that the New York Times mentioned in their story, like

00:32:17
Dragoneer that are putting in like some sort of side money,

00:32:20
like I think it was some kind of an SPV actually into open AI.

00:32:24
Like they also could be buying into a company that might not be

00:32:27
able to successfully convert into a for profit.

00:32:31
Like the numbers are well ahead of like the actuality on any of

00:32:34
this shit. So it's, it's super problematic.

00:32:37
And also, I mean, like, I never like you can't bring it up

00:32:40
enough. In my opinion.

00:32:41
Is, is, is Stargate. And like that also had a huge number

00:32:45
attached to it, $500 billion, which you know.

00:32:48
As far as I can tell. That was a Trump number, too.

00:32:50
That was, yeah. It was a Trump number.

00:32:52
Yeah. And was it Larry?

00:32:53
Yes. And who was the third person?

00:32:54
Larry and Masa, they were all

00:32:56
there and but the 500 billion number was not like, you know,

00:32:59
they have like a very, you know, advanced spreadsheet where they

00:33:02
calculated like project by project and realized over 10

00:33:05
years what it was. Yeah, Like what's what's the

00:33:07
number that's too big though? That would, you know, a

00:33:09
trillion. No one's going to believe that.

00:33:11
And like 200 billion, that seems like not a big number.

00:33:15
So they end up... You used to work for the publication.

00:33:17
Weren't you, like, a contributor or something on that

00:33:20
story where the Wall Street Journal said... I got attacked.

00:33:22
Open AI was gonna raise like 6 trillion.

00:33:25
What was that? 7, 7 trillion.

00:33:27
Wall Street Journal. What were they doing?

00:33:29
Like, why did that happen? You know, now what's the answer?

00:33:33
Like what? There's only so much I can say

00:33:35
on that one, but I will say that like, you know, you should give

00:33:38
a lot of journalists, put a lot of stock into this: it was true at the time.

00:33:43
People were saying to other people that they were

00:33:46
throwing around and saying numbers to each other and.

00:33:49
Well, Sam was making fun of it too, you know, after it came

00:33:52
out, he's like, fuck it, let's do 9 trillion.

00:33:53
And it's like, you know what, Sam?

00:33:55
Like you can't have it both ways.

00:33:57
And that's all I'll say about that.

00:33:59
Right. Sam is clearly out.

00:34:00
He was, I'm sure, out and about throwing around big numbers and

00:34:04
then the Wall Street Journal reported it, and people have the

00:34:07
right to know. But I think the the Journal

00:34:09
should have never put it in the headline, which I think was my

00:34:11
stance at the time. And they should have been, it

00:34:13
should have been much more like in this meeting and with this

00:34:16
person said this number, and they probably should have said,

00:34:19
and that's absurd. I mean, I honestly think that's

00:34:21
our competitive advantage. Like, with the numbers being thrown

00:34:24
around, it's either this is a meaningless number, which is

00:34:27
what I think it ended up being because it's like, oh, it's a

00:34:28
statement about overall spending across like the economy, you

00:34:32
know, or you know, it's just like putting it in the headline

00:34:36
as if it's like a deal number was the insane thing.

00:34:39
If you pass around the hat enough times, I guess you can

00:34:42
reach 7 trillion. It's like the implication of

00:34:45
that headline. Yeah.

00:34:48
And I actually think the best story that came out that year on

00:34:51
that topic was what The Times ended up doing, you know,

00:34:55
a bunch of months later where they talked about how all of

00:34:57
these chip manufacturers laughed Sam out of the room when he was

00:35:01
going around to like, whatever at like, TSMC or yeah.

00:35:05
Yeah, they were like they called him a podcast guy or something.

00:35:09
Or. Podcast, bro, which I don't even

00:35:10
know what that means, but I like it.

00:35:12
Yeah, Yeah, 'cause like, yeah, when the big boys, you know, the

00:35:15
ones who actually deal with numbers and like production at

00:35:17
that scale are like seeing you put your fucking pitch deck up

00:35:21
on the screen and they're just like, that's not how it works,

00:35:23
brah. Yeah. That was like the reality check

00:35:28
I was very happy to see happen to Sam.

00:35:29
Yeah, that was a good story there.

00:35:30
That's good. We we criticized the New York

00:35:32
Times. We complimented them.

00:35:34
Yeah, we're like back to dead cat here.

00:35:36
That's. No, Tim, Tim, Apple, super

00:35:39
embarrassing, $600 billion, meaningless number.

00:35:42
I also saw one of the things that came out of it was like,

00:35:44
you know, a headline was like Apple to move production of like

00:35:47
Corning Glass to Kentucky or something.

00:35:50
And someone could fact-check me on this?

00:35:52
I'm pretty sure they already have that.

00:35:54
Yeah, I think they already do. Well, I think they'd already

00:35:57
said it was like all in the United States, and now they're

00:36:00
like saying it again. And somebody I think pointed

00:36:02
out, like, weren't they already supposed to be?

00:36:04
I think it might have been Mark Gurman saying like weren't

00:36:07
they already doing it 100% in the US? So yeah.

00:36:11
It's very much like if you remember the episode of Mad Men

00:36:13
when he's working with Lucky Strike and they have the ad

00:36:16
campaign for the cigarettes. It's toasted. Toasted.

00:36:19
Yeah, yeah, of course, I remember.

00:36:21
That, yeah, yeah. So that was basically the it's

00:36:22
toasted moment for Apple just being like, now glass

00:36:26
manufactured in the US, never enough of that shit.

00:36:30
And it works on Trump, and it's frankly a lot more realistic and

00:36:33
meaningful. Easily, you just say Trump, you get

00:36:36
credit for this thing. It's like it doesn't matter if

00:36:38
you were doing it or not. It's just like now the

00:36:40
credit flows to him anyway. Well, that's way better than

00:36:42
like Howard Lutnick going

00:36:45
around during the tariff moment trying to get people hyped up

00:36:48
that we're going to start doing the little screws in the

00:36:50
iPhones in America. Or he was like giddily going on

00:36:53
CNBC talking about that. Like that one never should have

00:36:56
made it out of the brainstorming session.

00:36:57
So this is better. I I'll give him, I'll give him

00:36:59
credit for that. All right, see you next week.