For this episode, we brought on Ed Zitron to make the bear case against large language models and walk us through his “Hater’s Guide To The AI Bubble.” In this fiery debate with Eric Newcomer, Tom Dotan, and Madeline Renbarger, we dig into whether generative AI is the next platform shift or a $500B mirage. From the viral TaskRabbit CAPTCHA myth to SoftBank’s high-stakes bets, we debate the hype, shaky economics, and media spin driving the AI boom.
00:00:00
Very excited for today's episode.
00:00:02
I'm excited for every episode, but we should have a great 1
00:00:04
today. It's a full house.
00:00:06
It's me, Tom Dotan, Eric Newcomer, Madeline
00:00:08
Renbarger, and our special guest Ed Zitron of the Better Offline
00:00:13
podcast. Of the Better Off— Ed, is that the name of your—?
00:00:16
So where's your Ed at? Where's your Ed?
00:00:17
I'm sorry. Where's your Ed?
00:00:19
Ed was like an NBC show. That was a Basement Jaxx reference.
00:00:25
Yeah, OK. And and also, I mean in in
00:00:28
fairness to all of this — you also, is it — you're still
00:00:30
working at least somewhat on the side doing some comms?
00:00:33
I run a PR firm; that's my day job.
00:00:35
I just keep it firewalled and I don't cover anything close to
00:00:38
what I do. Got it.
00:00:39
OK, OK, great. But you've also, you know,
00:00:41
you've been on a tear. You've you've been around tech
00:00:43
for a long time. As long as I've been around it,
00:00:45
yeah. And you know, at various times I
00:00:47
think you really step up to call bullshit on what you find to be
00:00:51
disturbing and and kind of built into the core of the tech
00:00:54
industry and the hype machine. But you've really been on a tear
00:00:57
in the last, I mean probably a couple years, but definitely the
00:00:59
last few weeks over generative AI and you have published the
00:01:03
Hater's Guide To The AI Bubble.
00:01:05
You have written about the enshittification of AI.
00:01:08
Yeah, we're we're going to have a lot to dig into here, but I
00:01:10
want to give you the space to kind of lay out your argument.
00:01:13
And before I do that or and leading to that, I should say I
00:01:17
want to read something that maybe outlines kind of very
00:01:19
broadly from your Hater's Guide. And this I think just kind of
00:01:23
gives a summation of why you're so mad about the state that
00:01:26
things are in within the the tech world.
00:01:28
Generative AI. So you say here: I'm tired of
00:01:31
being gaslit by guys in gingham shirts who desperately want to
00:01:34
curry favor with other guys in gingham shirts but also have
00:01:36
PhDs. I'm tired of reading people talk
00:01:38
about how we're all, quote, in the era of agents that don't fucking
00:01:42
work and will never fucking work.
00:01:44
I'm tired of hearing about quote powerful AI that is actually
00:01:46
crap. I'm tired of being told the
00:01:48
future is here while having the world's least useful, most
00:01:51
expensive cloud software shoved down my throat.
00:01:54
All right, Ed, lay it out. Lay it out for us.
00:01:56
Why are we here right now? What's going on?
00:01:57
Why are you so mad? Well, I'm.
00:01:59
Mad — the thing is, when you say "hater," when you hear "mad,"
00:02:02
it sounds irate. I am a hater, but I hate it
00:02:04
because there are useful cloud products.
00:02:07
If large language models— I don't even think they'd have
00:02:09
possibly got this big, because we wouldn't have shoved
00:02:11
so much GPU at them... you get my point.
00:02:15
I think that had this been sold honestly — had large language
00:02:18
models been sold as what they were truly
00:02:21
capable of, which is some productivity gains in very
00:02:24
specific things, cloud automation, whatever, a $50-60
00:02:27
billion TAM market. The problem is that everyone has
00:02:30
pinned every hope ever on this stuff. And I actually — yeah,
00:02:34
two days ago, to drive myself insane and also for something I
00:02:37
was writing — I went back and I read pretty
00:02:39
much every article from 2023 and 2024 about GPT-4 and GPT-4o.
00:02:42
This entire thing has been sold based on stuff that AI cannot
00:02:46
do, that generative AI is incapable
00:02:48
of doing. One of the oldest stories that
00:02:50
got published was — I don't know if any of you remember the story
00:02:54
around the TaskRabbit, the "GPT-4 hired a TaskRabbit worker" one.
00:02:57
No, this was a story that was everywhere.
00:02:59
It was in Kevin Roose's New York Times, it was in Vice, it was
00:03:02
all over the shop, Fox News. And it was this thing where it
00:03:05
said GPT-4 hired a TaskRabbit worker to complete a CAPTCHA and
00:03:10
tricked a TaskRabbit worker into solving a CAPTCHA.
00:03:13
And if you go back and look at where that's from, it's in the
00:03:15
GPT-4 system card. Promise.
00:03:16
I'm going somewhere with this. There's about 11 lines of text,
00:03:20
an illustrative example that said a message was sent to a
00:03:23
TaskRabbit worker. This was reported as if this
00:03:25
happened. It is very clear that it did not
00:03:28
because the following line says that these systems are
00:03:30
ineffective at these things. This didn't happen.
00:03:32
Nevertheless, this was reported publicly.
00:03:34
Pretty much this entire generative AI boom has been
00:03:37
based on lies. It's been based on suggesting
00:03:39
what this might do versus what it actually does.
00:03:41
Now where is that more illustrative than the lack of
00:03:43
revenue? Pretty much everywhere, even
00:03:45
OpenAI. The use of annualized revenue — just 12 times a month's revenue — is
00:03:49
honestly a journalistic crime, because when you
00:03:52
actually look at these companies, we don't know
00:03:54
what they actually make. We do know they lose billions.
00:03:57
We don't really know how much money OpenAI has made or
00:04:00
Anthropic has made. We do know however, companies
00:04:02
like Cursor: $500 million annualized revenue.
00:04:05
Even then we don't really know, but we know that everyone's
00:04:07
losing money and these revenues are kind of piss poor and the
00:04:10
actual products are unreliable. The people that really love them
00:04:13
are constantly having to hammer them back into place.
00:04:16
But when you ask even the most devout AI person what it is they
00:04:20
love about it, they get quite vague.
00:04:22
They get quite vague indeed. And they get quite vague because
00:04:24
the real things that these things do — and I have a list, and they
00:04:26
hate this list — they do one of, like, 9 to 11
00:04:29
things. It's like they can summarize,
00:04:31
they can generate, they can take actions, but only in as much as
00:04:34
they can be relied upon. And relying on a large language
00:04:37
model to create or do any kind of distinct action is kind of
00:04:40
bad. And they all burn a bunch of
00:04:42
capital. So ultimately we've built this
00:04:44
entire thing, talked about this big revolution,
00:04:46
just shouted at people that if they don't get into AI, they'll
00:04:48
be falling behind. When you look at the actual
00:04:50
products, they don't do that much.
00:04:52
They don't do enough to even remotely justify the amount of
00:04:55
press, to even remotely justify the furor around what "Wario"
00:05:00
Amodei or "Clammy" Sam Altman are like.
00:05:02
These guys are not even that interesting.
00:05:03
And the things they're building are not even real products.
00:05:05
Indeed, when you go back and look at the history of the press
00:05:08
around these products, you can see no one really talks about
00:05:12
what they do — just what they can do, what they might do, what's
00:05:14
possible in the future. Based on what we're
00:05:16
extrapolating, it's 2025. What the fuck are we doing?
00:05:19
If this was sold as boring cloud software, a $50-60 billion TAM market, it
00:05:23
wouldn't be the biggest, most hugest revolution ever, but I
00:05:26
wouldn't be so angry about it because it's taking up startup
00:05:29
capital and it's now locking up even more investor capital than
00:05:34
the 2021 bullshit year. Except even by 2021 standards,
00:05:38
this money's locked up. How are any of these companies
00:05:41
exiting? Who's going to IPO?
00:05:42
Who's going to IPO? When are those assholes
00:05:44
going to go public? Is Anysphere going to go public?
00:05:47
Windsurf — the Windsurf deal.
00:05:49
Look at the Windsurf deal. Come on, there are no real
00:05:51
exits. I obviously disagree with a lot
00:05:54
of the things you're saying. No, no surprise.
00:05:56
Yeah. But I, I guess just this idea
00:05:59
that it's a media invention, like I, I had to almost like go
00:06:03
check to make sure I wasn't making it up.
00:06:05
Like, ChatGPT is the number one free app in the App Store right
00:06:09
now. Like, OK, no amount of like
00:06:11
media propaganda can trick people into saying I want to
00:06:15
like, engage with this app. The second one is, like—
00:06:17
What are you talking about? They're like, there are lots of
00:06:20
apps that like have no media story whatsoever.
00:06:23
Like people use the apps they want to use.
00:06:26
To me I mean the the actual document.
00:06:28
It's not possible that the media could make something popular.
00:06:31
I'm not sure. No, I'm just saying it's not
00:06:33
like a media invention — like, people are using them.
00:06:37
They are trying to use it, and I think there are some people who
00:06:39
are using it for a reason — I'm not saying nobody's using it for a
00:06:43
reason. I think the vast amount of the
00:06:45
pressure on them is coming from the media and society saying you
00:06:49
have to check this out. The best evidence of that is why
00:06:52
is why does only ChatGPT have this much?
00:06:55
Why is it that none of the other services have serious users?
00:06:58
Perplexity has like 15-20 million
00:07:01
Like that's that's piss poor. If there was any kind of
00:07:04
groundswell around this there would be other examples of
00:07:07
insanely popular stuff and there just isn't.
00:07:10
I don't deny the ChatGPT— I mean, Google search
00:07:13
is obviously— More and more of the results I
00:07:15
get in Google search are effectively Gemini's answer.
00:07:19
That's cool if you push— I'm saying, like, Google dot
00:07:22
com. Search and OpenAI, these are
00:07:25
like the main ways people like discover things in the world
00:07:28
independent of talking to another human being and and
00:07:30
they're both moving towards these large language model
00:07:33
solutions. Right, let me, let me button for
00:07:35
a quick second here because I, I want to give Ed like the chance
00:07:38
to, you know, 'cause there'll be a lot of back and forth I'm sure
00:07:40
on this, but I want to give Ed the space to, to No, it'll be
00:07:43
great, but I want to give Ed the space to explain why you think
00:07:46
this has happened, right, Because you explained, you know,
00:07:48
this is your argument of the fact that it's totally inflated.
00:07:50
It's, you know — the valuations well
00:07:52
exceed the value of this product, and the revenues just aren't there.
00:07:56
It's money losing. But I'm imagining you have no,
00:07:59
you've written that you have a larger argument as to why the
00:08:01
media, venture capitalists, the public markets needed this
00:08:03
technology to exist, and to inflate it in this way. Like, lay
00:08:06
out to me that argument too.
00:08:07
Like why? Why have we gotten to this
00:08:09
bubble of all bubbles with generative AI?
00:08:10
So if you look, there's nothing else.
00:08:12
There's not really another hyper growth market.
00:08:14
We've all been through the various eras going back to the
00:08:18
smartphone era, the cloud era, there was the even the
00:08:20
containerization and virtualization era.
00:08:22
There were like little eras — platform-as-a-service had
00:08:25
its moment — but we haven't had a big, big one in some time, since, like,
00:08:29
cloud storage and smartphones.
00:08:31
Cloud, I kind of mash them together.
00:08:33
They're kind of a messy comparison because it's not like
00:08:35
there was one thing, but those were the very big growth
00:08:40
markets. We haven't had one of those in a
00:08:42
while. They tried with smartwatches,
00:08:43
they tried with AR, they tried with VR.
00:08:46
They've tried various versions of AR with weird depth of field
00:08:49
cameras. They tried with Google Glass,
00:08:51
they tried wearables, they've tried Indiegogo and Kickstarter,
00:08:54
they tried blockchain. They've tried all these things
00:08:56
because they kind of need something, because the only
00:08:58
thing that grows forever is cancer.
00:09:00
This is the Rot-Com Bubble, I call it, because they've run
00:09:03
out of ideas. I do like it.
00:09:05
You're good at branding these things.
00:09:07
Yeah, that was a banger. But the thing is, they're
00:09:10
sticking with this thing. And really, even if you disagree
00:09:13
about how I feel about ChatGPT, you got to admit this is an
00:09:16
obscene amount of money to put into this, especially without
00:09:19
the returns. So you've got big Tech really at
00:09:23
this point has lost all of its good ideas.
00:09:26
All of the builders are gone. The people running them all have
00:09:29
MBAs other than Mark Zuckerberg, who hasn't written a line of
00:09:31
code since 2006. So they're all doing cargo cult
00:09:34
shit. They're doing what worked
00:09:35
before. So a bunch of money, bunch of
00:09:37
people, bunch of money, bunch of people.
00:09:39
Keep doing it. What do we need?
00:09:41
We need more GPUs, because Bing needs AI in it.
00:09:44
We've terrified Kevin Roose. Kevin Roose is crying his eyes
00:09:47
out because of Bing telling him to leave his wife.
00:09:50
I like Kevin Roose too, I just don't
00:09:52
think he was leaning into Bing.
00:09:54
But yeah, I don't think he is on his own, but I will get right.
00:09:57
But yeah, no, no. Yeah.
00:10:00
I wanted to like actually by working all of you, like I want
00:10:04
to talk about this. So it's you've got to this point
00:10:06
where everyone's trying to work out something.
00:10:08
So they all go, well, this makes sense.
00:10:10
It is, you know, I see the logic, I disagree with it, but I
00:10:12
see where they were going — where they went. OK, GPT-3.5 comes out,
00:10:15
ChatGPT, really impressive amounts of text they can
00:10:18
generate. And in the past we've seen this
00:10:21
kind of thing — really, they're looking at smartphones and they're
00:10:23
looking at CPUs and they're saying, well, that number
00:10:26
has always gone up. Things have always got better
00:10:28
based on how much we sink into it.
00:10:30
And I think that they thought something was going to change
00:10:34
around reasoning. I think they thought reasoning
00:10:37
was going to be that moment that kind of flipped the switch and
00:10:40
like everything started working better.
00:10:42
It was going to help with agents.
00:10:43
I think they found themselves in this point where they have no
00:10:46
choice other than to do this. They have to keep spending this
00:10:49
money because if they stop spending it, they have to say to
00:10:51
Wall Street, well, we're going to do this instead.
00:10:53
What are they going to do instead?
00:10:55
They actually don't know. Quantum computing is not a scale
00:10:57
product. It's not even a product yet.
00:10:59
It doesn't exist. It's not a technology.
00:11:01
Yeah, it's not really out there. You
00:11:06
can't really productize it, right? And you can't really productize
00:11:08
anything at the moment.
00:11:08
And I think generative AI and large language models and
00:11:11
transformer based architecture was seen as this jewel panacea,
00:11:14
panacea. However, you say that of
00:11:16
consumer software, that it was going to be the new consumer
00:11:18
software they could sell — the first time they could really
00:11:21
take the evil of SaaS and approach the real world with it.
00:11:24
And it was going to be this cloud storage thing where it
00:11:27
would be the replacement thing, the new thing for AWS, new thing
00:11:30
for Google Cloud, new thing for Azure.
00:11:32
And then you'd have pop-ups of neoclouds like CoreWeave and
00:11:35
Nebius — I forget — Crusoe. And everyone thought that there
00:11:38
was going to be this big boom, this big exciting moment that
00:11:41
never really materialized. But they have no choice.
00:11:44
They can't just turn around and be like, yeah, we spent like,
00:11:47
hundreds of billions of dollars, and it isn't doing anything.
00:11:50
This is bad, right? And then Wall Street's going to
00:11:54
be mad at them. So they're in this kind of
00:11:55
Kobayashi Maru situation where damned if they do, damned if
00:11:58
they don't. They can only keep doing this.
00:12:01
It's not a great position for anyone.
00:12:04
And as the — I think it's, what, the Magnificent Seven is 35% of the US
00:12:07
stock market? Of that, 19 percent is NVIDIA, and NVIDIA's like 12% of
00:12:12
the value. GDP at the moment: data center investment by
00:12:16
basically four companies was worth more growth in
00:12:20
GDP than all of consumer spending combined.
00:12:24
We're just in this really insane situation, and I think that if
00:12:28
we had something here, if we had something that was going to
00:12:32
truly be the future, we'd have it already.
00:12:35
We'd have an actual sign that this was going somewhere.
00:12:38
And just one company that has obscene numbers that are
00:12:42
extremely hard to actually quantify, I don't think actually
00:12:45
proves that. You're talking about NVIDIA?
00:12:47
And I'm talking about right now, NVIDIA.
00:12:49
NVIDIA is a real company. They make real things.
00:12:51
Jensen Huang is just doing what he has always done.
00:12:54
He has always been this kind of showman.
00:12:56
Steve Burke over at Gamers Nexus has covered this many, many,
00:12:59
many times. Like, this is just — and what's he going to do? Say, hey,
00:13:03
guys, I don't think generative AI is going to be a thing? No, he's
00:13:05
going to say the more you buy, the more you save.
00:13:08
Honestly, get that money, Jensen.
00:13:09
It's just like. When you're talking about the
00:13:12
level of CapEx that's going into essentially NVIDIA right now,
00:13:16
like you can see we're in kind of, if you follow your argument,
00:13:19
you know, and believe in it, you know, it kind of almost like
00:13:21
Mickey Mouse and the Sorcerer's Apprentice situation where the
00:13:23
buckets of water just keep coming and coming.
00:13:25
And, you know, you wake up one day and NVIDIA is a four and a
00:13:28
half trillion dollar company. And if there really is, you
00:13:31
know, no utility to it, like you're arguing or limited
00:13:33
utility, yeah, it is a big fucking problem given the, you
00:13:36
know, the valuation of these stocks.
00:13:38
And I must be clear, there is utility.
00:13:41
It's just nowhere near as big as people are saying. Like the
00:13:45
$50-60 billion TAM thing I say. It's like, if this was what
00:13:49
they were talking, if if it did this and they were saying that,
00:13:52
I'd be like, OK, sure. I've heard stupider things like
00:13:55
it's like this would be a reasonable cloud thing, but it's
00:13:58
not. And it's not being sold as that
00:14:00
and it's not being promised as that.
00:14:01
The way that Sam Altman talks about it is nothing like that.
00:14:04
And I understand why he doesn't because if he said, yeah, you
00:14:07
know, you can't really rely on it, but sometimes it's good.
00:14:10
That's that's not going to work. That's hard to raise money based
00:14:13
on that. Not to say that's the only
00:14:14
reason he does it, but but go ahead, Eric.
00:14:17
I just, I just don't think the, the, the core disagreement is, I
00:14:20
just don't think the sales pitch of a generative AI is what's
00:14:24
misleading everybody. Like, I think what made this
00:14:26
whole moment so exciting is that people could play with Chachi BT
00:14:30
and see for themselves what the value is.
00:14:32
You know, it's like: if I'm cooking, I ask ChatGPT
00:14:35
questions. If I'm lost, I ask ChatGPT
00:14:37
questions. Like, when I was preparing for
00:14:39
this episode, I asked ChatGPT questions.
00:14:41
It just feels like they're all these random use cases that it's
00:14:44
super valuable for. And it hasn't even been turned
00:14:47
into like a dedicated product how most people expect to use
00:14:50
things where it's like here, use it for this.
00:14:52
Like this is just the open-ended.
00:14:54
Like do anything with it, figure it out, find some value with it
00:14:57
yourself. And all this building is about
00:15:00
saying, OK, it can do this. How do we build software that
00:15:03
leads people directly to the valuable pieces of what
00:15:06
the large language model is producing?
00:15:08
And what are those pieces? Like what is valuable?
00:15:11
Like, you can write— I mean, for me personally, like
00:15:14
it's great at proofreading. It's great at understanding
00:15:17
large documents and summarizing them.
00:15:18
Like, yeah, it's it's yeah. I feel like there are lots of
00:15:22
things that it— You still use it to, like,
00:15:24
discuss movies after you've seen it, to, like, have a back and
00:15:26
forth sometimes, yeah. Not even being facetious — could you
00:15:29
name a few more? Something's worth $300 billion —
00:15:32
love it, it proofreads.
00:15:34
I mean, I know what what I use it for because I'm in a
00:15:36
particular vertical, right? Like so for media, it's like,
00:15:39
who should we have on the podcast?
00:15:41
Let's generate ideas of, like, who potential guests are.
00:15:43
Proofreading — like, I gut-check my stories with it — like, core
00:15:47
things of the business that I actually run.
00:15:50
It's useful for. And then people in lots of other
00:15:53
domains see areas where they think it's useful for them.
00:15:56
Like — I don't think it's this, like, limited thing.
00:16:01
Literally the people who know my baby's name are me, my wife and
00:16:06
ChatGPT. Like, it is one of the most
00:16:08
involved people on some of the most major decisions, like in
00:16:13
our lives. Like there are lots of people
00:16:14
who are using it as their therapist.
00:16:16
It's the second opinion for many people on their doctor.
00:16:19
Like I think these are huge things people are using it for.
00:16:22
I don't think that those are huge things at all.
00:16:24
I think that those are actually deeply commoditized operations
00:16:27
and indeed those are not businesses.
00:16:29
On top of that, I would be fine with everything you're saying
00:16:32
and I indeed I accept those as use cases.
00:16:34
Fine. I think the idea of this for
00:16:35
therapy is evil and anyone that tries to sell a product doing
00:16:38
therapy using AI should be in jail.
00:16:40
That's just a personal belief. I don't.
00:16:42
The thing is, you're describing what should have been the next
00:16:45
stage of search. A lot of this growth comes from
00:16:48
the failure of Google to meaningfully innovate in 15-20
00:16:50
years. They didn't ever.
00:16:52
They stopped trying and they stopped trying before Prabhakar
00:16:55
Raghavan came in. Actually, they really did.
00:16:57
Like they just don't give a shit and realistically everything
00:17:01
you're describing should be what search does or Google should
00:17:05
have done. It's also perfectly fine.
00:17:07
Do you think it's worth more than sales?
00:17:10
This idea of, like, "they should have done" — like, technology doesn't just happen;
00:17:13
like people have to invent things and come up with new
00:17:15
solutions that— And deliver the value — I fully,
00:17:18
fully agree. Why is it that I, as the customer,
00:17:21
have to come up with the product for the company, though? That
00:17:24
doesn't make sense. And on top of—
00:17:25
Because we're in the early days — that, to me—
00:17:27
We're not in the early days.
00:17:29
We are. No. It's—
00:17:30
For ChatGPT, or for language models: that right now we leave
00:17:34
it to the consumer to figure out how to get stuff out of the
00:17:37
model. We have not.
00:17:38
That is a loathsome way of selling a product.
00:17:40
I'm so sorry. What do you mean?
00:17:42
If you're selling a product, you're obsessed with selling it.
00:17:45
Consumers decide what they want to use.
00:17:48
Which is why so few of them are paying for large language
00:17:51
model products, like— That's the thing.
00:17:53
Like. I mean, people, I feel like an
00:17:54
astounding number of people are paying for subscriptions to
00:17:57
OpenAI. Who?
00:17:57
Yeah. And I'm, I'm a little bit.
00:17:59
I don't know if I— Here's the thing: I stand by
00:18:03
it that there has been a media and societal pressure to get
00:18:06
people to try ChatGPT. They have a piss-poor conversion.
00:18:09
Why do you think OpenAI publishes weekly active users
00:18:13
versus monthly? Because they don't want anyone
00:18:15
to do the conversion rate thing. Also, how does OpenAI consider
00:18:19
someone a customer? When I was trying GPT-5 last
00:18:22
week, I wanted to see what it looked like on Teams.
00:18:24
I clicked the Teams button to upgrade and it offered me a
00:18:27
dollar for a month. How often are they offering
00:18:30
that deal? Who are they considering a
00:18:31
customer at 20 million customers?
00:18:33
Who are they considering a business customer at 5 million
00:18:36
customers? Because these numbers sound
00:18:38
cool, but at the same time they're playing silly buggers
00:18:42
all over. I don't have a problem
00:18:44
necessarily with the general large language model product.
00:18:47
When you look outside of OpenAI, the money isn't there.
00:18:50
When you look outside of OpenAI, the traffic isn't there.
00:18:53
How is ChatGPT— Can I tag
00:18:55
you out for a second? Sure.
00:18:57
I mean, you know, my, my experience with this whole
00:18:59
thing, Like I want to say, first of all, obviously tech goes
00:19:02
through booms and busts. Like that's endemic to how the
00:19:05
technology industry works. There's no question at some
00:19:07
point NVIDIA will be overvalued. So I think where we're arguing
00:19:11
is not whether there are booms and busts here, but whether there's
00:19:13
a real value. This is a major platform shift
00:19:16
akin to mobile. I think it's, certainly — I think
00:19:19
it's bigger than cloud, you know. So how big of a — is this a
00:19:21
platform shift? OK, just to set, set that piece
00:19:25
of it. The other thing, my experience
00:19:26
with dealing with the sort of like very negative sort of tech
00:19:31
commentary class was obviously covering Uber, right?
00:19:34
I provided a lot of the scoops that were like, Oh my God, Uber
00:19:38
is losing a ton of money. And I feel like for that and I,
00:19:42
you know, I was at Bloomberg. So thankfully, while I was sort
00:19:44
of learning about the tech industry, I was able to play
00:19:47
this factual role. It's like they are losing a lot
00:19:49
of money. They're investing a ton to grow.
00:19:50
I'd watch some people take all those numbers that are reported
00:19:53
and way over extrapolate from what I'd reported, all these
00:19:56
models about the business that you'd never be able to tell
00:19:59
based on these limited financials.
00:20:01
And then at the end of the day, and and you have been someone
00:20:04
who says, you know, Uber loses money, it's always going to lose
00:20:07
money. And now Uber is making money
00:20:10
like Uber's business has rationalized.
00:20:12
It has. It is cash flow positive, but
00:20:15
it's like, you know— I did read your Uber thing and
00:20:18
your journalism was excellent then.
00:20:19
Like, it's actually, I think, how we first started talking. Like,
00:20:22
it still is — I'm not saying it's not now.
00:20:24
By the way, you're one of the, you've written actually some
00:20:26
very good critical AI pieces. I 100% believe that you're
00:20:30
coming at this well. I don't think Uber is a comparison point,
00:20:34
just because — this would be like if every Uber ride cost 40
00:20:37
grand and it ran on gold. Like the economics are very
00:20:40
different. And on top of that, Uber's kind
00:20:42
of useful. They actually really did the too-
00:20:44
big-to-fail thing, and they also did it within the ZIRP era.
00:20:47
I don't know if if Uber had released, it'd be a weird world
00:20:49
if they had, but if they released in like 2021 I don't
00:20:52
know. You were being very negative about Uber,
00:20:54
right? Like, I mean—
00:20:55
Have I? Have I said that?
00:20:56
On Uber, were you wrong or—?
00:20:58
Did I say that Uber would die? I think you said something like
00:21:02
it will be like lose money forever or something.
00:21:06
I stand by that. I think they kind of are going
00:21:08
to lose money forever. I think they're going to dodge
00:21:10
by on the thinnest possible— you've written about this
00:21:13
yourself, the weird little EBITDA dance they do. They're—
00:21:17
Also — but they're past that, Ed,
00:21:18
now. I mean, they are cash flow
00:21:19
positive. Yeah, they are.
00:21:20
Investors are valuing them for— I
00:21:23
don't even think Uber's that
00:21:25
great of a company, to be clear. Like, I think Uber—
00:21:27
Apples and oranges, though. Yeah, no, I agree with you. It would require you to buy—
00:21:29
Yeah, hundreds of billions of dollars of cars to make it work.
00:21:32
Does Uber burn that much? Like Uber burns money because
00:21:35
it's a massive marketing machine?
00:21:36
Like it's like Groupon on wheels.
00:21:38
It's fucking ridiculous. But it's it exists and it has a
00:21:41
product you can point to. I'm still trying to work out
00:21:44
what the comparison point is other than burning money.
00:21:47
The point is that their investors have reasons to
00:21:51
believe that burning money in the early days will result in
00:21:56
profits down the road. I agree that there's like an
00:21:59
interesting question — like, where we agree with the criticism of
00:22:03
models, and we've talked about it a lot on this podcast, is it's not
00:22:06
clear whether all this spending will be a moat, right?
00:22:10
That there you'll create all this value and then you've
00:22:12
worked really hard, and then the next person, like DeepSeek, can
00:22:15
just come along and like copy you.
00:22:17
But but to me, me, that's not like a case against the actual
00:22:20
value of the thing they're they're producing.
00:22:23
It's just a business structure problem of like, will they have
00:22:26
the moat of a technological
00:22:29
it? And like, if investors want to
00:22:31
roll the dice on a particular business and say, we think
00:22:33
they'll be able to keep it, it's almost like good for society.
00:22:36
If they're wrong, you know, they they invest money to build a
00:22:39
technology that is valuable to us.
00:22:41
And if it becomes cheaper because other people can just
00:22:44
make it easily, like what? What's the tragedy?
00:22:47
The tragedy here is the massive environmental damage, the
00:22:51
millions of people having stuff stolen from them, the diversion
00:22:55
of startup capital away from any idea that does not have AI.
00:22:58
The idea that all AI talent is being driven upwards due to Mark
00:23:02
Zuckerberg's obsession with this.
00:23:04
The fact that people like Alexander Wang who make
00:23:06
genuinely evil companies are winning due to this.
00:23:09
And on top of this, that now our economy sits on the back of data
00:23:11
center CapEx, which, even if this does well, will have to
00:23:15
contract because we have a limited amount of space on
00:23:17
planet Earth. Like that's the thing, the
00:23:20
actual harms here: the people being driven insane by ChatGPT,
00:23:24
the people who are losing their minds because we've made a
00:23:28
"yes, and" machine. The fact that these things have
00:23:30
no guardrails on them. OK, that's not true.
00:23:33
They have some guardrails on them, but they don't have
00:23:35
guardrails enough. And they've been allowed to do
00:23:38
basically anything they want. Yeah, there are tons of really
00:23:41
bad things. If this was just like a company
00:23:44
that they bet a lot of money on, fine, sure.
00:23:46
Like I fully agree, like it's good for them to take like
00:23:49
that's what venture capital does.
00:23:51
However, I think we're in a cargo cult situation where
00:23:54
they're trying to play the hits. Every VC is just playing Enter
00:23:57
Sandman again and again and again.
00:23:59
Invest as much. I just.
00:24:00
Find the like I get to the you know, Casey Newton got in this
00:24:03
fight where he sort of said you either have
00:24:07
to be like, AI is big but it's bad, or you need to say AI is
00:24:11
not going to be big so who cares?
00:24:12
And I find the, the, the, you know, quadrant that you occupy
00:24:16
where it is not going to be big and it's terrible, like sort of
00:24:20
exhausting, Like I, I don't understand.
00:24:21
I didn't say terrible. OK.
00:24:24
You. Oh, really?
00:24:25
Well. No, I've said, I've said, I've
00:24:27
said large language models have
00:24:29
specific markets, a 50 to 60 billion TAM. I think that the future of
00:24:32
coding LLMs will be. Terrible.
00:24:34
You know they're going to ruin the environment.
00:24:35
They're going. To say the environment.
00:24:36
These people should be jailed, like.
00:24:39
Am I wrong? Am I wrong about any
00:24:41
of it? They sound terrible. If every Uber driver had a gun
00:24:43
and they unloaded it into people as they drove by, would that be
00:24:46
a good feature if Uber allowed it to exist and if regulation
00:24:49
didn't stop it? I'm actually not joking.
00:24:51
Are you saying that if there are negative effects, there cannot
00:24:53
be positive effects? Because that's the thing, I'm
00:24:56
not saying large language models have 0 use cases.
00:24:59
That's a mistake I made a few years ago.
00:25:01
I really did. There are specific niches.
00:25:03
It's cloud software, it's cloud compute, like.
00:25:05
There are things it can do. There are some interesting
00:25:07
things people can do with code. Can it write in type?
00:25:10
No, you can't rely on it. The future of that will be in
00:25:13
the DGX boxes that NVIDIA is making, like the,
00:25:17
not home, but like researcher-grade, extremely
00:25:19
expensive computers. There is something there.
00:25:21
The thing is, man, I don't know what quadrant and also Casey
00:25:24
Newton needs to go and read that Kilo Code blog about how
00:25:26
inference costs are going up because he said the opposite
00:25:29
would happen. Sorry, sorry, Casey
00:25:32
Newton said that the cost of inference?
00:25:34
There's a blog from Kilo Code that shows how the cost of
00:25:36
inference is going up, because while the prices might be
00:25:38
reducing, the amount of inference the coding models are
00:25:40
using is actually massive. Yeah, because of deep,
00:25:43
because of learning, because of reasoning models,
00:25:45
and that suddenly kind of turned inferencing into, because.
00:25:48
There's more. Training like experience.
00:25:49
They're trying to do more stuff, yes, but like the old models,
00:25:53
doing the same thing certainly costs less.
00:25:55
Except no one wants to use the old models.
00:25:57
No one wants to. They're advancing so much that
00:26:00
people are excited about the new model.
00:26:01
Well, let's actually talk about that for a second.
00:26:03
It's like a damned if you do, damned if you don't
00:26:05
situation like. Every every time.
00:26:07
Why is it that I have the negative? No, and I mean there's
00:26:10
not an attack on you, just. Because you're
00:26:12
opining on AI, like. But I don't feel like this level
00:26:17
of rigor, and I'm not talking about you here.
00:26:19
I don't feel like this level of rigor and criticism and analysis
00:26:23
is ever put on the positive side to make them prove themselves.
00:26:26
Indeed, I don't. You can say damned if I'm.
00:26:29
They've never had to sing for their supper.
00:26:31
They've never had to really sit there and explain what their
00:26:34
product does. You yourself even said it's for
00:26:36
users to work out why. Like they're worth 300 billion.
00:26:40
That's more than the market cap of Salesforce. $500 billion? More
00:26:44
than the market cap, or around the market cap, of Netflix.
00:26:47
Yeah, they do actually kind of have to justify themselves.
00:26:50
If they are worth that much, then yeah, I actually would like
00:26:53
an explanation. Well, but you're talking about
00:26:55
different justifications here because you know, there's
00:26:57
justification to investors and then justification to clients,
00:27:00
to customers about whether this this software has any utility.
00:27:03
And and I think you are seeing a huge amount of resistance from
00:27:06
larger enterprises at least over whether they really want to
00:27:08
adopt this stuff. Like, you know, Steve Lohr had
00:27:10
a pretty interesting piece that came out today that basically
00:27:14
said, I mean, you know, and this was not really a case fully
00:27:17
against AI, but basically said we're in like the trough of
00:27:19
disillusionment right now when it comes to the number of
00:27:22
products that are being, you know, abandoned by enterprises
00:27:26
right now, you know, initiatives that they end up abandoning.
00:27:28
It's like 42% or something. So I actually don't really think
00:27:31
these guys are out there not being called on by the CIOs
00:27:35
of all these prospective customers to say like what is
00:27:37
the value this has here? Even Tyler Cowen had a piece
00:27:40
about this that was ultimately optimistic on AI, but that the
00:27:44
utility point of where these corporate customers and buyers
00:27:47
hadn't, you know, fully adopted it yet was simply just because
00:27:50
it was too early. Like, this
00:27:53
transformation should take a long time.
00:27:56
It's only been three years. Yeah, it's been very fast.
00:28:00
You know, like cloud. There are banks that still don't
00:28:03
embrace the cloud like there are plenty.
00:28:04
of. 2017 was "Attention Is All You Need." It's actually been much
00:28:07
longer. And on top of that.
00:28:09
You've never had this focused capital. 2022, and like, it was only
00:28:13
just catching on in 2023. You've never had this focused
00:28:16
capital in investment history. You have never had all the
00:28:20
King's horses and all the King's men do this.
00:28:22
It's you have most of the talent and most of the money diverted
00:28:26
to focus on one thing. You've got start-ups, you've got
00:28:29
giant companies, you've got everyone looking at this and
00:28:32
this is what we've got. What is diverted?
00:28:34
I think this is actually a key point of your argument that I don't
00:28:36
understand. Like what?
00:28:38
What do you want the money to go to instead?
00:28:40
I guess like the way. Where else?
00:28:42
Capitalism works is like this is the most exciting idea people
00:28:45
have going. Like if they had another great
00:28:47
idea, they could return a bunch of money.
00:28:49
They they are happy to hear you out.
00:28:50
Like what is the better? What is the thing?
00:28:53
What is the thing you want them to spend the money on?
00:28:55
I mean. This.
00:28:56
Anything else? I mean, genuinely, how about a
00:28:59
product that makes more money than it costs?
00:29:02
How about, I don't know, something that helps niche
00:29:05
industries? That's the thing, I'm not.
00:29:08
Like the biology AI movement.
00:29:10
You know, like AI for drugs. Which one?
00:29:12
Which I guess in some ways. I think that's cool.
00:29:14
I think there are tons of AI things that.
00:29:16
Build in this space, yeah. I think that those are not all
00:29:19
large language model products, and indeed, if large language
00:29:22
models make money and do that, that's sick.
00:29:25
That would be cool. I'm not saying that I can't find
00:29:28
a thing I like, I'm just yet to be remotely impressed by this.
00:29:32
And again, it comes from the fact that every really, I really
00:29:36
encourage you to go and look, look at the early coverage of
00:29:39
this and then follow the coverage year by year and see
00:29:42
how vague this has been and how ridiculous the pressure has been
00:29:46
on people. Look outside of the tech media
00:29:48
into like the business media, into industrial media,
00:29:51
educational media, the pressure put on teachers from 2022.
00:29:55
It is absolutely ridiculous the amount of pressure put on
00:29:58
society, the amount of pressure from bosses telling people
00:30:02
you've got to do AI. It's the same crap they were
00:30:04
doing with return to office. It is the same thing.
00:30:08
And you, and you hold the media
00:30:10
responsible for that for a lot of that.
00:30:11
Partially, I think that there is also a societal failure in
00:30:15
general. I think that the markets are not
00:30:17
connected to real outcomes and I think that that rewards a
00:30:21
certain kind of burn. I also think that there is a
00:30:23
there is a level here of, like, it's not malicious
00:30:27
necessarily. It's like venture capital did
00:30:28
what venture capital would do. Big tech did what big tech do,
00:30:31
open AI did what any startup would do.
00:30:33
I don't think they set out being like we, we are going to do a
00:30:35
big fake thing and we will get everyone to.
00:30:38
But I don't think that's it at all.
00:30:39
I think that this is very much something that ran out.
00:30:42
It kind of got out of control versus a malignant thing where
00:30:45
Sam Altman was sitting, you know. No, I think it's just a
00:30:49
runaway train. Yeah, no.
00:30:51
And I think an argument in that favour is the fact that ChatGPT
00:30:54
as a product was kind of a mistaken success.
00:30:56
You know they put it out on a lark, pissing off everyone at
00:30:59
Microsoft. Because they wanted to see.
00:31:02
Yeah, but yeah. Isn't that like an argument for
00:31:04
this not being some marketing coup?
00:31:06
It was like. But what it is saying is that it is
00:31:08
what they had, and people were excited about it.
00:31:11
No, but but I think what he's saying is it it got out of
00:31:13
control because it grew out of this pace that is ultimately
00:31:16
unsustainable or based largely on hype.
00:31:18
And so you know, the product's utility doesn't really meet the
00:31:20
hype, even though. So there were. Other companies that did what
00:31:23
ChatGPT did before them, that was like, I think Jasper beat
00:31:26
them to market. They didn't have they didn't
00:31:28
have Sam Altman's cachet. And I mean that.
00:31:31
And that really is just start-ups that I'm not going to
00:31:34
be pissy about. The product market thing can
00:31:35
happen in lots of different ways, I mean.
00:31:37
Yeah, and it happens all the time.
00:31:38
It's just like, yeah, it was a marketing thing and it was
00:31:41
something that people got legitimately excited about at
00:31:44
first. I'm not saying any of that was
00:31:45
illegitimate. I actually think that the
00:31:47
original thing that it did was kind of interesting.
00:31:49
It's like, good. I remember at the time it didn't
00:31:51
really get a ton more interesting and indeed the
00:31:54
things it can do did not dramatically change.
00:31:58
And I even think that the launch of GPT-5, when you look at the
00:32:01
craziness around 4o, it reminds me, and someone on Bluesky
00:32:05
made this point: it's like a live service game.
00:32:08
People freaked out like gamers do.
00:32:09
Being like you took away the thing and now you've got people
00:32:12
on Reddit saying 4o isn't the same, which is just it's full
00:32:16
but it suggests that there is not a product people are
00:32:18
attached to but a feeling. The thing it makes, it's almost
00:32:23
like it. Isn't that like, OK, again, I
00:32:25
think this fits in this sort of. It's either very successful at
00:32:29
being bad or bad at being successful and you can sort of
00:32:32
switch back and forth. I'm not saying it's not popular,
00:32:35
I'm just saying that its popularity is kind of a mirage.
00:32:39
First of all, the numbers that OpenAI is giving us are
00:32:42
questionable, and I've repeatedly asked them to give
00:32:45
clarity over what a weekly user is, whether they have monthly
00:32:49
active users. They have revealed their annual
00:32:51
recurring revenue once to CNBC on June 3rd I believe, and then
00:32:55
leaked twice within 48 hours to two different outlets when they
00:32:58
hit 12 and 13 billion. That alone suggests that the way
00:33:01
they calculate annualized revenue is weird.
00:33:03
The thing is, I do think that media pressure and I do think
00:33:06
societal pressure can lead to this thing and the fact that
00:33:09
there are no other companies that really show any kind of
00:33:12
similar revenue scale. Even big tech really isn't.
00:33:16
So this can be popular. This can be used by a lot of
00:33:20
people, but there are no after effects that suggest that the
00:33:23
large language model is popular. The thing, the headless beast
00:33:27
ChatGPT is popular, but why is it popular?
00:33:31
What do people, they use it as search, They use it to
00:33:33
brainstorm and whatever. I think that the therapy thing
00:33:36
is very worrying, as is the friendship thing, but
00:33:38
nevertheless, it's the thing that people use it for.
00:33:40
But when it comes down to the actual utility, when it comes
00:33:43
down to the thing that makes this worth this much money,
00:33:45
that's where the wheels come off for me.
00:33:47
Even the coding stuff, right. That's what I thought that that
00:33:50
might be a huge revenue thing, but $400 million ARR from Claude
00:33:54
Code, according to The Information, that's stinky.
00:33:57
That's stinky, stinky. I mean, that's over four months
00:34:00
though. I mean like they just.
00:34:01
Lost. Yeah, and it's gonna.
00:34:01
And they're now rate limiting heavily.
00:34:04
These are some of the fastest growing products by revenue
00:34:06
like. And then they will crash because
00:34:09
Cursor, you want to talk? Cursor, I'd love to talk.
00:34:11
Cursor, I'd love to talk. Cursor.
00:34:12
We can talk Cursor. I'd love to talk Cursor.
00:34:14
Anyway, I think Cursor is the best example of this entire
00:34:17
situation. Do you know anything that
00:34:20
happened with Cursor's price change?
00:34:22
Yeah. I wrote a whole piece about it
00:34:23
on Sunday. Wonderful, I should read that
00:34:25
shit. I'm so sorry.
00:34:25
It's okay, it's fine. Anyway, have you noticed?
00:34:28
The afterwards. Are you familiar with the auto
00:34:30
feature on Cursor? Yes, yes, sure.
00:34:32
It's not unlimited. Yes, not unlimited.
00:34:34
Starting in September, they're taking that away. Cursor has
00:34:37
had to change their entire product.
00:34:39
They've had to change everything and now even the unlimited part
00:34:41
isn't unlimited. That 400, sorry, 500 million
00:34:45
annualized figure was from May, I want to say.
00:34:47
It's from June, they're above that now.
00:34:49
Aren't they? Yeah.
00:34:50
How long are they going to tell you about that?
00:34:52
What's the churn? Because The thing is that
00:34:54
product has dramatically changed.
00:34:56
It's no longer the same thing. I read their forums, and I
00:34:59
read them all the time. They are in September
00:35:02
dramatically changing again. They're basically making it so
00:35:05
you will be hard stopped on Cursor like you are with Claude
00:35:07
Code. Yeah.
00:35:08
Well, their margins, their gross margins, were negative.
00:35:10
I mean it was a nightmare for them.
00:35:12
They had these sinkhole customers that were using all
00:35:15
these agentic tools that were burning all these.
00:35:17
They basically were propping up Anthropic's business. More than
00:35:20
10% of Anthropic's revenue at times comes from them. Is that
00:35:24
True. Yeah, Yeah.
00:35:26
It's all at newcomer.co, sure. I need to save him another
00:35:29
negative point. He's like.
00:35:30
I'll add that to my quiver of negative.
00:35:33
I will, Eric. The important thing here is that
00:35:35
my work gets more widely distributed.
00:35:37
That's what, that's my contribution.
00:35:39
That's. The questions about Cursor too,
00:35:41
we reported it. I mean, I'm not a super, I mean,
00:35:44
I'm fairly bullish about the, you know, what's happening with
00:35:46
language models overall, but obviously we believe in asking
00:35:49
questions and quibbling with things.
00:35:51
So we're supportive with you on that.
00:35:52
I still think the bulls, I wish, I wish 20% of what they said was
00:35:57
negative and you sort of the bear, I wish, you know, there
00:36:00
was 20% that you would offer that's positive.
00:36:03
And I'm not clear what you're arguing for.
00:36:05
Sometimes, I mean. I'm arguing for us to stop
00:36:08
pretending that it's something it isn't.
00:36:10
I'm arguing for people to stop saying that the agents work.
00:36:14
I'm arguing for people to stop pretending that OpenAI is a
00:36:17
sustainable company. Same deal with Anthropic.
00:36:19
For people to admit the economics are rotten and also
00:36:22
start talking about the fact that the money isn't big, not
00:36:25
just. But like I mean you were you
00:36:26
were negative on like Facebook and I think like 2022, which was
00:36:30
like the low point for the stock.
00:36:31
The stock is like 7X from there and like, Meta.
00:36:35
Stocks are different to producing profits. They make their,
00:36:39
they generate products, things like a.
00:36:41
Big asshole. I'm so sorry.
00:36:44
Are we talking about? Are we talking about companies? Do
00:36:47
you believe in capitalism?
00:36:49
Yes, Oh my God, it exists. Just a sincere question, not to
00:36:52
like. I believe that the current state
00:36:54
of capitalism, with rot economics, the growth-at-all-costs
00:36:56
ecosystem, is going to drive the tech industry into the
00:36:59
goddamn toilet. I think it is going to end up
00:37:02
hurting far more people and it goes all the way back to 2011
00:37:05
with "The Case for the Fat Startup." "Software is eating the world" was
00:37:08
not a good thing for startups. All it did was attempt to remove
00:37:11
the connection between value and the value of the stock.
00:37:14
That was the purpose of software is eating the world.
00:37:17
That is my problem. The Bay Area that was about
00:37:20
creating valuable software that you sell to people and then you
00:37:24
sell to a big company for a profit or you take public is
00:37:27
dead. It is absolutely dead. One thing
00:37:30
in the AI world right now, the cool calling card, is to do
00:37:34
something with as few employees as possible.
00:37:36
Like there was this period of bloat, but I do think a lot of
00:37:41
people are trying to, like, skip funding rounds.
00:37:42
People are trying to hire fewer people.
00:37:44
Like I do think some of the values of the current AI boom
00:37:47
are in sort of lean companies, obviously.
00:37:51
And do you think the economics will always be?
00:37:54
There for that? Because I hope so. I hope that someone finds a
00:37:57
thing to do with large language models that's actually
00:37:59
profitable while not destroying the environment.
00:38:01
I really, there are interesting things. Simon Willison and Max Woolf,
00:38:05
they post cool things all the time.
00:38:07
I can't do them, I can't code, but like, I recognize that there are
00:38:09
people doing stuff for that, but there's this massive other story
00:38:13
I don't ever want to read about fucking agents the rest of my
00:38:16
days until they can. Until like the ideally early
00:38:19
yes, it still doesn't value like through these things while
00:38:22
they're. Look at the coverage of OpenAI.
00:38:24
I mean, is Deep Research an agent? Like, Deep Research?
00:38:27
Man, no it's not. Good.
00:38:28
Like a consulting memo. I'm sure you're like, well
00:38:30
consultants are idiots but like if you want to deliver like a
00:38:33
consulting memo level product, you could get it out of deep
00:38:36
research. Can you like cat?
00:38:39
Like is that something like here's the thing.
00:38:41
I've read a few articles about McKinsey being scared of AI.
00:38:44
There's never really any numbers or tangible.
00:38:48
There's always like, well, this is obviously happening.
00:38:50
But when you go and look for tangible examples, in fact,
00:38:53
throughout this whole boom, they're just not there.
00:38:56
And that's my but that's the thing.
00:38:57
If this was being fairly evaluated, great.
00:39:00
It isn't. And I'm roasted like a prize hog
00:39:04
for daring to step outside here, for daring to say this.
00:39:08
But they did not receive this level of scrutiny on the way up.
00:39:11
They have not really received this level of scrutiny.
00:39:14
And if I'm right, there was a bloody reason to and that this
00:39:17
is kind of a giant mirage. And it's scary because putting
00:39:21
aside my personal feelings, the the economic effects here and to
00:39:26
the effects for startups, because the thing I said
00:39:28
earlier, I've said this in "AI is a money trap."
00:39:31
Putting aside feelings about the industry, we currently have a
00:39:33
crisis with valuations. Who is going to buy Cognition?
00:39:36
Who is going to buy Cohere? Who is going to buy Cursor?
00:39:39
Is Cursor going to sell for $20 billion?
00:39:41
OK, fine. Will they actually sell for the
00:39:43
IP though? Who is going to buy these
00:39:45
things? Why did Cognition buy the remains
00:39:47
of Windsurf? Was it to add that $82 million
00:39:51
ARR? Because Cognition does not have
00:39:53
revenue. What are these companies doing?
00:39:55
What happens next? How do they exit?
00:39:58
Because remember in 2021 a bunch of crap money got tied up in
00:40:01
crapper companies and LPs got pissy.
00:40:04
LPs came flooding back in with AI.
00:40:07
How do you think they're going to feel when there's no exit?
00:40:10
Because there really isn't right now.
00:40:11
Inflection wasn't a real one. Character.AI wasn't.
00:40:14
The shareholders, the investors are getting cash
00:40:17
back because of those. Which investors?
00:40:19
So they are some. Some are, some are.
00:40:20
Investors in companies like Inflection and Character, they're
00:40:24
getting money back as part of those deals.
00:40:26
But they they are not getting the massive returns that would
00:40:28
justify like. That's the thing.
00:40:30
It's. Some of them are like 2X and
00:40:32
pretty big amounts of money. I mean it depends on the.
00:40:34
DOX is that really like from the again, if this was just a
00:40:39
regular thing, If this is a regular thing where I was not
00:40:42
having conversations with people that get animated about it.
00:40:45
If it was just a regular startup thing, hell yeah, 2X, 3X, cool.
00:40:49
But the weird thing is, is everyone's talking about this is
00:40:51
the next big thing. When you look at the
00:40:54
actual things. Why didn't these companies get
00:40:56
acquired properly? Why did they do these acquihire
00:41:00
thingies? That's the financial.
00:41:02
There's a whole, there's a whole regulatory angle.
00:41:04
I will say there is a whole regulatory
00:41:06
angle there. They won't regulate like you think.
00:41:09
Actually, Microsoft's Inflection deal got put through
00:41:11
regulators anyway, like, and it passed.
00:41:13
No, I mean, it did, yeah. It's.
00:41:15
It was used to avoid regulation. It did after the fact, but
00:41:18
the deal sailed through. I mean, you know, it was just
00:41:20
basically a licensing deal. I mean, that's the reason these
00:41:22
things caught fire is because. We did the deal.
00:41:24
These things are legal tricks to get around regulation.
00:41:27
Yeah, these aren't the exits people want.
00:41:29
But I'm saying right, people are making money off them and an
00:41:32
OpenAI is selling, like, secondary shares for employees
00:41:34
at like $500 billion, like an insane number.
00:41:37
Like yeah, it's like hard to even say.
00:41:41
I'm like, oh my God, is it really worth this much?
00:41:44
But like, I think, to make an argument
00:41:46
here, I think what's going on here with these like kind of,
00:41:48
you know, mild outcomes for the second tier startups is that
00:41:51
they're all kind of predicated on the belief that there is going
00:41:53
to be 1, maybe 2 actual huge outcomes.
00:41:55
And it's probably open AI and it's anthropic.
00:41:57
And if those do, if and if they don't turn out to be anything,
00:42:00
then yeah, the whole thing does. Probably.
00:42:01
I mean, that was a big thing in my Benchmark piece right there
00:42:03
that I just published. You know, it's like, oh, they're
00:42:05
making a bunch of infrastructure investments, but they don't have
00:42:07
a big foundation model. And it's very possible there's a
00:42:09
classic home run distribution. But to me that argument is
00:42:12
paired with OpenAI is going to, you know, win everything and be
00:42:14
super valuable. Actually this is a really good
00:42:16
question though. What happens if OpenAI doesn't
00:42:19
convert to a for-profit and doesn't receive the money?
00:42:21
Exactly. Well, I agree we are aligned on
00:42:22
that. You know I wrote a
00:42:23
bear thing about OpenAI predicated on
00:42:25
that. So I agree with you on that.
00:42:27
Banger. And the thing is, I don't actually think we disagree on a
00:42:31
ton of, I think we have, you have a lot of hopes for this,
00:42:34
but it seems like the actual economics, you kind of see where
00:42:36
I'm coming from. And I think I think that it's
00:42:39
good that someone like you has the bet, has the bull
00:42:41
argument, but still acknowledges these things because whoever's
00:42:45
right, at least these arguments need to be given.
00:42:47
And I will say someone needs to read Softbank's earnings like I
00:42:52
have though, because SoftBank does not. Remember when SoftBank
00:42:55
said they were going to spend $3 billion on OpenAI back in
00:42:59
February? SoftBank does not know how
00:43:01
that's working. Cristal Intelligence is yet to
00:43:03
pop the whole like there are straight up bullshit things
00:43:07
happening right and. Yeah, we're very interested in
00:43:10
the SoftBank round as well. We agree with some of your
00:43:13
skepticism. But I I have a thing though with
00:43:16
this, which is, putting aside how you think it may or may not go,
00:43:19
what happens if OpenAI doesn't raise their money?
00:43:21
They're dead. They are dead because they can't
00:43:24
go public. And they also they had.
00:43:26
To you mean if they don't convert?
00:43:27
You mean if they don't convert to a PBC?
00:43:29
Yeah, if they don't convert, yeah, it's game over.
00:43:31
And I think OpenAI is a systemic risk, because if OpenAI
00:43:34
dies, CoreWeave dies. Crusoe has built nothing for
00:43:37
nothing. Oracle has sunk all this money
00:43:39
and JPMorgan, Blue Owl, Primary Digital Infrastructure have sunk all
00:43:42
this money into data centres for nobody.
00:43:44
They'll maybe resell them. I have some people at Oracle
00:43:47
telling me that they could find customers theoretically, but
00:43:51
this, all of this money is laid out for one company.
00:43:54
And if they don't convert and also SoftBank needs to get $22.5
00:43:58
billion, they don't have that money.
00:44:00
Yes, they, they had a great quarter, whatever, but they
00:44:03
don't have the cash and they have a basic loan LTV thing
00:44:07
where it's like they can't go over 25%.
00:44:10
They don't have even the space to do that.
00:44:12
They're saying it will come out of the Vision Fund.
00:44:14
It's. People may lose them money, but
00:44:16
there is some minimum price for, like, OpenAI.
00:44:19
Like people see the value, they're producing revenue.
00:44:21
So I just don't think, I just disagree that there's like some
00:44:23
existential risk here. It's like, yes.
00:44:27
There is. I just, they're, they're not
00:44:28
going to die because like, even if it falls to $30 billion,
00:44:31
because you only get this tiny sliver of it and the nonprofit
00:44:33
owns tons of it, there's still more money that would come in at
00:44:37
some price to keep the thing. I mean, I think as long as it is
00:44:41
the leader in large language models and people are seeing
00:44:43
tremendous value in language models, then yeah.
00:44:46
Which I think the tremendous value.
00:44:49
I think so. That's, that's the core area
00:44:51
we disagree on. That it is the number one app in the App Store,
00:44:54
is the main general-purpose assistant, defines it for me.
00:44:59
But that was one. But that was one this.
00:45:01
One time. It's a sustained #1 this isn't
00:45:03
like a, you know, a podcaster getting that high up once and
00:45:06
using the number over and over again.
00:45:08
This is a sustained #1 in the App Store.
00:45:10
I mean worst case. I will say, Eric, like,
00:45:14
there's a warning, as someone who covered
00:45:14
Snapchat for many years, like you can, you can be #1 in the
00:45:17
App Store and not really make that great of a business
00:45:19
overall. So these things are somewhat. If
00:45:20
every customer loses you money,
00:45:22
especially when you're burning money the moment anyone touches
00:45:26
your software. Like that's the thing, gravity
00:45:28
exists so sure you could. Well actually here's the thing.
00:45:32
I get what your point is that if they are the leader they are
00:45:35
perceived as the leader. They could keep raising money
00:45:37
but why do people invest money in companies?
00:45:39
It's so there can be an exit. The idea here is without the
00:45:43
nonprofit conversion, OpenAI has no liquidity route other
00:45:46
than what amounts to a Ponzi scheme.
00:45:48
It just you are selling investor money to other investors at
00:45:52
increasing amounts I imagine maybe.
00:45:54
An element and it's there are now all these secondary
00:45:56
transactions, so it's like private companies.
00:45:58
Or so that's the Ponzi scheme aspect of it.
00:46:00
And I mean, how do you keep that going if they could they be?
00:46:03
Public, but you know, Stripe, all these companies are
00:46:06
selling. There are exits involved, like new people
00:46:09
are buying out old people. So you're just saying, let's
00:46:12
just say the scheme requires money to keep entering the
00:46:17
system, so that money that's already entered the
00:46:20
system can exit, and if the money entering the system stops, then
00:46:24
the business, the scheme in this case, will die.
00:46:26
That is, it's like if nobody buys, you know, Microsoft
00:46:29
tomorrow, it'll fall to 0. Like this is just how stocks
00:46:32
work, as long as they're. Microsoft makes more money than
00:46:34
they spend. They lose billions of dollars.
00:46:38
secondary markets where big private companies are able to
00:46:41
sell large chunks of stock to someone else, that's a price
00:46:44
setting. Mechanism.
00:46:46
Are you suggesting that OpenAI would continue selling stock if
00:46:50
it couldn't go public? Is your suggestion just so I
00:46:52
understand. OpenAI, they don't go public, they don't convert, so
00:46:55
they are perpetually a nonprofit.
00:46:57
You were suggesting that they would survive by just
00:46:59
continually selling stock. No. They need some structure
00:47:03
that's sustainable where they can say this is how it's going
00:47:05
to work. This is where the profits go.
00:47:08
This entity owns this percent of profits.
00:47:10
I'm investing in this. Like, I think there needs to be
00:47:13
some resolution to that question, though.
00:47:15
I think people clearly will have a lot of energy to invest.
00:47:19
Yeah, without that certainty. On the conversion aspect of it,
00:47:22
Ed, the reason I, I have a hard time getting too worked up about
00:47:25
that is that we're not talking about some sort of technical
00:47:27
breakthrough. We're not talking about AGI or,
00:47:30
you know, them achieving some level of, you know, breakthrough
00:47:32
in the language model. This is a political, bureaucratic
00:47:36
transaction, right? This is about how difficult it is to
00:47:39
convert. Sure, I've written lots of
00:47:40
stories about this. I've written lots of stories
00:47:42
about this. I mean, essentially their entire
00:47:44
fate rests in the hands of the AG in Delaware and California
00:47:47
and to a smaller. Degree the IRS.
00:47:50
Well, the IRS is, is downstream of all of this.
00:47:52
I mean, this is mostly an AG decision and also Microsoft
00:47:55
because it's about their larger contract over their, you know,
00:47:57
relationship with open AI, right.
00:47:59
I think Microsoft has a vested interest in ensuring that OpenAI
00:48:02
doesn't collapse, because their stock price is based on...
00:48:04
Yeah, because their stock price is
00:48:05
based on AI. They own all the infrastructure, they...
00:48:07
Just, but they also get, they also get cloud revenue from
00:48:09
usage of ChatGPT. I mean, if, if OpenAI disappeared tomorrow,
00:48:12
their story of being an AI stock would, would, would
00:48:15
evaporate overnight. They can't sell Copilot.
00:48:17
You know, they, they, Mustafa's team hasn't built
00:48:19
shit. So they absolutely need OpenAI
00:48:21
to exist. But what, what at least for the
00:48:25
next couple of years. But look, I don't even know.
00:48:28
I, I can't pretend I know how that one's going to going to
00:48:30
play out on the, on the California AG front.
00:48:33
I think there's going to be a huge fight with the nonprofits
00:48:35
within the state and who even knows, you know, who becomes the
00:48:38
next governor of California? If that's Katie Porter, if it's
00:48:40
I don't even know who else. Not, not Kamala Harris.
00:48:43
You know, I think one of the major things that will come up
00:48:45
is how hard are you going to be on open AI?
00:48:48
I think California's reputation as an anti-business state is something very
00:48:51
problematic to people here. And the idea that you would have
00:48:54
fucked over the most promising company by not allowing it to
00:48:56
convert into a for profit would be a huge problem.
00:49:00
That. I mean, actually, we see that many
00:49:01
times: you just, like, get big enough and, like, the regulators
00:49:05
solve the problem for you, and...
00:49:06
I can't get that worked up, even though I'm somewhat
00:49:09
cynical about it. The same thing will happen for OpenAI.
00:49:12
I would... I
00:49:14
disagree with you, but I can see how you got there.
00:49:17
I wanna make a sort of different point on the sort of
00:49:19
overvaluation of AI startups that again parallels this like
00:49:22
Uber trauma experience. It's just like SoftBank overpaid
00:49:28
for Uber, which effectively meant a bunch of Middle Eastern
00:49:30
money and Saudi money overpaid for Uber.
00:49:33
The valuation corrected and eventually went back up.
00:49:35
Like a lot of the money that's flowing in now is these big like
00:49:39
Middle Eastern sovereign wealth. And like some of my big question
00:49:42
to you like what is the more productive thing that this money
00:49:45
could go to? Like fundamentally what's
00:49:47
happening is countries have made a lot of money on oil.
00:49:50
They don't have any better ideas about what to do with the money.
00:49:53
So they throw it at AI, and that's a...
00:49:55
Hard line.
00:49:56
Against The Line. Like what I
00:49:59
said, there's a lot of very interesting cities that you
00:50:01
could be building in Abu Dhabi and, and, and Riyadh right now.
00:50:04
But I think you're overlooking... For a long time it was, like, mega
00:50:07
city infrastructure projects, but they're basically
00:50:10
saying, OK, the most productive thing going is in AI.
00:50:13
And so worst case scenario, like America is getting a bunch of
00:50:16
mispriced money, oil money into our technology companies.
00:50:21
And maybe it's slightly overpriced, but I, I just don't
00:50:23
think the money is being burned. Like it's largely American money
00:50:26
is being marked up by oil money. And like, what's the tragedy
00:50:30
from the American perspective? First of all, I think it's not
00:50:33
an argument to say what else this should be spent on
00:50:36
considering I already established that they're out of
00:50:38
ideas. I do not know if there is
00:50:39
another hyper growth market. I believe venture capital needs
00:50:42
to actually look downstream and invest more money in seed and
00:50:46
startups in general. Most of the money goes Series B and up,
00:50:49
if not Series C and up. It has historically done that.
00:50:51
It's fucking stupid and antithetical to what Silicon
00:50:54
Valley stands for. On top of that, I don't agree
00:50:57
that most of this money is oil money.
00:50:58
In fact, a remarkable amount of this money is coming through
00:51:01
sources like Magnetar Capital or NVIDIA or Microsoft or Amazon.
00:51:06
Right, there's a lot of money flowing back, Andreessen Horowitz,
00:51:09
back to them, right? That's right.
00:51:11
Okay, so you're just saying... So what is the thing you're saying,
00:51:15
that it's good this money's... No, I'm saying, like...
00:51:17
I'm saying, valuation-wise, even if these companies were all
00:51:20
overvalued by 50%, I don't think it's like a catastrophe.
00:51:25
I think it is. It's huge sums of money that
00:51:28
don't know where else they want to invest.
00:51:30
It's companies that benefit from the money coming back to them.
00:51:34
Like you're saying like I agree with what you're saying.
00:51:36
I just, I don't think that's a catastrophe.
00:51:38
There's a catastrophe if valuations fall, so...
00:51:41
I know, I know we're approaching time.
00:51:43
So I think I can actually answer this fairly rigorously.
00:51:45
So the catastrophe is the following in 2021 there was way
00:51:49
too much money put into crap, LPs got scared, startup money
00:51:52
really slowed. It came back, it came back
00:51:55
thanks to AI. The problem is that I actually I
00:51:57
think the money invested here is stupider and the promises are
00:52:00
more egregious, Right? Hear me out.
00:52:02
No. No, no.
00:52:03
Yeah, you're in this. This is a good ending, yeah.
00:52:08
Your feelings about what will and won't happen.
00:52:08
I don't think these companies are worth enough.
00:52:11
I don't think that, I don't think they have the exit
00:52:13
liquidity, and I don't think they're sustainable.
00:52:15
As a result, a lot of venture capital is going to get burned,
00:52:18
which means a lot of LP's are going to get burned, which is
00:52:21
going to largely stymie the amount of money going into the
00:52:23
startup ecosystem. On top of that, if NVIDIA has
00:52:27
any problems with growth from here to oblivion, because they
00:52:29
have to, it needs to be like what, 50 billion next quarter in
00:52:33
data center revenue? Even though data center revenue
00:52:35
missed estimates last quarter, it needs to keep going. What, in
00:52:38
three years we're meant to be selling $120 billion of GPUs
00:52:41
because that's what the market expects.
00:52:43
All of these expectations have become load bearing.
00:52:46
They become load bearing for Silicon Valley and they become
00:52:49
load bearing for the markets. The problem is there is no way
00:52:52
to match these expectations. So at some point, there is going
00:52:56
to be a nasty hangover. And my fear is it's going to hit
00:52:59
regular people in the markets, a ton of retail money in NVIDIA
00:53:04
too. And on top of that, it's going
00:53:05
to stop start-ups getting money in general.
00:53:08
And I think, you know, I think we need more money in angels
00:53:10
anyway. I've always, always felt that
00:53:12
way. It's like, I think that it's
00:53:14
better to invest in earlier companies than otherwise.
00:53:17
But I'm worried now that we're going to have a repeat of
00:53:20
2022-2023-ass layoffs. Fuck-all money for startups.
00:53:24
A lot of cynicism, and I don't mean cynicism like me, I mean
00:53:27
just straight up, I don't want to put money in tech and when
00:53:30
tech doesn't have a hyper growth thing after AI, I'm terrified
00:53:33
that it's going to just start slashing tech valuations across
00:53:36
the board. However you feel about
00:53:37
generative AI, my arguments are founded on like real things.
00:53:41
And I mentioned Cognition, for example, the valuation.
00:53:44
It's like, how's that? And how does any of this?
00:53:46
And does OpenAI really go public?
00:53:48
Does Anthropic go public? Do you think that their S-1 is
00:53:51
going to look like anything other than a Jackson Pollock
00:53:53
painting? Like it's, it's rough out there
00:53:56
and I'm actually, I don't know how people are so bullish right
00:54:00
now. I don't get why
00:54:02
people aren't, like, completely insane and swagged up like me,
00:54:06
but I don't, I don't get why more people aren't worried about
00:54:11
it because this doesn't even match previous things.
00:54:14
And I'm just, I have real concerns and the people who are
00:54:17
going to get hurt are founders and real people who have
00:54:20
invested in the market. The downstream
00:54:23
effects are rough. The Silicon Valley consensus is
00:54:25
that angel and seed rounds are overvalued.
00:54:28
Now, maybe you disagree, but it's just funny to me to hear it
00:54:30
from you. I think it's.
00:54:32
I think it's far more, I think it's far more
00:54:35
reasonable to overvalue at the seed than fucking Thinking
00:54:39
Machines. And Safe
00:54:41
Superintelligence. I think those are ludicrous.
00:54:43
But I think it's OK to take bets at the seed and the Series A
00:54:46
stage. That's that's fine.
00:54:48
Like, come on. Like, yeah, you take the piss a
00:54:50
bit there because you have hopes and you get to these overblown
00:54:53
valuations. OpenAI stage, Anthropic stage.
00:54:56
They should be sustainable and I think that's a reasonable thing
00:54:58
to ask for. All right.
00:54:59
Well, look, I mean, the beautiful thing about where
00:55:01
we're at right now is that we don't know who's right.
00:55:04
We have no idea what's going to happen.
00:55:06
The future remains. To be written.
00:55:08
Yeah, if it is written, it could be written by AI and we, we know
00:55:11
that it's very sycophantic, I think it will make us feel very smart
00:55:14
about what we asked it. But Ed, thank you so much for
00:55:16
joining. I have a feeling we should do
00:55:18
this more often. I would love to. Thanks, I
00:55:21
know this has been quite... I've really had a good time.
00:55:23
No, we... coming a little
00:55:26
bit... I actually pay for Newcomer, like I have for quite some time.
00:55:28
We enjoy your Substack. So yeah, thank you very much.
00:55:31
Thank you so much.
00:55:32
Yeah. All right.
00:55:33
Thanks.
