Is the AI boom already peaking? In this episode of The Newcomer Podcast, Eric, Madeline and Tom take a hard look at the hype cycle driving Silicon Valley’s latest gold rush — from Andreessen Horowitz’s record-breaking $25 billion year to Amazon’s push to automate its entire workforce.

We explore whether AI’s trillion-dollar promise is real innovation, or if the cracks are already showing. From OpenAI’s overblown math claims to Andrej Karpathy’s “State of the Union” reflections, we break down what’s really happening behind the headlines.

🎙️ Topics in this episode:
- Andreessen Horowitz’s $25B AI windfall: how they pulled it off
- Amazon’s automation future: are human jobs at risk?
- Why AI hype might be masking stagnation in real progress
- Who’s actually profiting from the AI boom (and who’s not)
- The coming “AI bubble” — is it about to burst?
00:00:00
Is AI not nearly as powerful as some companies would lead us to
00:00:03
believe? We're recording this episode hot
00:00:05
on the heels of one of Andrej Karpathy's AI State of the Union
00:00:08
interviews. I'm only sounding pessimistic
00:00:10
because when I go on my Twitter timeline, I see all this stuff
00:00:13
that makes no sense to me. It has us reflecting on the
00:00:16
vastly different takes people are having about AI.
00:00:18
On the one hand, you have the New York Times reporting that
00:00:20
America's biggest employer is moving towards complete
00:00:23
automation, while on the other, we're seeing AI failing to
00:00:26
impress. In today's episode, we want to
00:00:27
look deeper into the power behind the potential bubble
00:00:30
we're seeing. But first, we'll be discussing a
00:00:32
recent article written by Madeline Renbarger in which she
00:00:35
takes a deep dive into how Andreessen Horowitz are making
00:00:37
off like such bandits. We'll look into the results
00:00:40
they're expecting to end the year with and what moves they
00:00:42
made to get nearly $25 billion in gains since its founding.
00:00:46
This is the Newcomer podcast. All right, Madeline, we've been
00:00:58
diving deep into Andreessen's numbers over the last couple of
00:01:03
weeks on Newcomer. And a couple weeks ago we had a
00:01:07
big story with a lot of their kind of unrealized and realized
00:01:10
returns. But the question that came out
00:01:12
of that story was like, but really how much was distributed
00:01:15
back to, sorry, back to investors, which we now
00:01:19
have a much more clear read on after the story that you and
00:01:22
Eric did. So this is real venture nerddom.
00:01:25
You know, they absolutely... Well, how is Andreessen doing?
00:01:28
Yeah, you can see markups, but then, you know, we get feedback
00:01:31
from our readers like, but what's the actual cash?
00:01:34
How do they... I mean, the real... Yeah.
00:01:35
Which is a reasonable thing to want to know more about.
00:01:37
I mean, this is always the question about venture, right?
00:01:39
Especially in an era where, you know, there's not a lot of IPOs
00:01:43
and you know, actual cash returns are like a big question
00:01:46
for all of these companies. So what do we, what do we got in
00:01:49
Andreessen? What did we, what did we learn
00:01:50
from your piece? Yeah.
00:01:52
So it's really a tale of where they made money, which kind of lands
00:01:56
into two buckets. There's all of the funds pre
00:01:59
2014. So their first few funds, those
00:02:02
have done incredibly well. And post 2014 crypto has done
00:02:07
incredibly well. And that's basically the story
00:02:10
of Andreessen's actual cash back to investors.
00:02:13
I would say, you know, the first fund was spectacular.
00:02:16
It has returned, this was, of course, as of end of 2024.
00:02:20
But they had returned 6X on the investment to investors from
00:02:25
their first fund. That included their Skype exit,
00:02:28
a couple other, you know, really big deals.
00:02:30
But like it's that they really showed their money there.
00:02:33
Like that first fund out of the gate.
00:02:35
Super impressive, kind of set their reputation.
00:02:37
Always good. If you start a venture firm, you
00:02:39
want to, you want to come out of the gate strong, and they did
00:02:41
that. And you should subscribe to
00:02:42
newcomers. You can see the whole chart.
00:02:44
We have all the numbers. You can follow along.
00:02:46
And I think one thing I'm proud of in this story is that we got
00:02:50
sort of the Cambridge Associates benchmarking data.
00:02:52
So you not only see how Andreessen's doing, but at the
00:02:55
same time what was sort of the top quartile and top 5% DPI
00:03:01
distributed to paid-in capital, for venture.
00:03:05
And you know, in this first fund Madeline's talking about,
00:03:08
Andreessen is at 6x DPI at the time of this, you know, September
00:03:14
2024. And then top quartile is 2.37x
00:03:19
and then top 5% is 3.98x.
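For readers following along, DPI is just cumulative cash distributed back to LPs divided by the capital they paid in. A minimal sketch of that arithmetic, using hypothetical cash flows alongside the benchmark multiples quoted in the episode:

```python
# DPI (distributions to paid-in capital): cash actually returned to LPs
# per dollar they paid into the fund. The fund cash flows below are
# hypothetical; the benchmark multiples are the ones quoted above.

def dpi(distributions, paid_in_capital):
    """Cumulative cash distributed to LPs / capital paid in."""
    return sum(distributions) / paid_in_capital

# Hypothetical first fund: $300M paid in, $1.8B distributed over its life.
fund_dpi = dpi([600e6, 700e6, 500e6], 300e6)
print(f"{fund_dpi:.2f}x")  # 6.00x

# Cambridge Associates-style benchmarks quoted in the episode (~Sept 2024):
top_quartile, top_5_percent = 2.37, 3.98
print(fund_dpi > top_5_percent)  # True: well above even the top 5%
```

Note that DPI only counts realized cash, which is why a fund can show strong markups (TVPI) while its DPI stays near zero.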
00:03:23
So by any measure, that first fund came out of the gate
00:03:26
strong. The second fund, they're in the
00:03:29
top quartile, but not the top 5%.
00:03:32
The third fund they're in both the top quartile and top 5%.
00:03:36
So that's that's the early days. Yeah, the early days, super
00:03:39
impressive and yeah, they're basically like in the one, I
00:03:42
won't go into direct averages, but like the very top of
00:03:45
performance. These first funds, which is what
00:03:47
you obviously want to see going forward.
00:03:50
Obviously the closer we get to the present, you're not going to
00:03:53
see as big of returns, right. But even their, you know, core
00:03:57
2016 fund, which is almost 10 years old, has a DPI of just 0.3.
00:04:02
So we've gone from 6x on the first fund to 0.3, where you
00:04:06
really should start seeing cash. So that one has really
00:04:10
underperformed. I mean, it's below the median,
00:04:12
so it's not even the top half of performance.
00:04:15
And those funds are bigger. I think a key thing with the
00:04:18
Andreessen story is that they get more and more money, right?
00:04:20
It's like they do well in the beginning and then they're like,
00:04:22
screw, like the benchmark model and this small model of venture
00:04:25
capital, we're going to get huge and we're going to raise a bunch
00:04:27
of money. And so part of what you want to
00:04:30
know as somebody covering Andreessen Horowitz, the, you
00:04:32
know, the most prolific investor now in Silicon Valley, founded
00:04:37
in 2009, you know, run by Marc Andreessen and Ben Horowitz,
00:04:41
this epic venture capital firm. You want to know it's like, OK,
00:04:44
they did well in his small venture capital.
00:04:47
Are they going to do well as a huge sort of money manager?
00:04:50
And those funds are not performing.
00:04:53
You know, the 2016 fund is worse than what you're saying, even
00:04:56
the median, right? So that is, yeah, it's not doing
00:04:59
great, right. And then a lot of venture,
00:05:02
Andreessen or or otherwise hasn't really returned a lot of
00:05:05
money in, you know, the last nine years.
00:05:08
I don't. Know I was going to say even as
00:05:10
you to your point, as they've gotten bigger, they've expanded
00:05:12
into other verticals. And so they have these biotech
00:05:15
specific funds and obviously we're kind of in a biotech
00:05:17
winter right now, but those have really underperformed even just
00:05:21
the rest of the Andreessen portfolio, they're not doing
00:05:24
well. I feel like, you know, their
00:05:26
capital hauls, that they've, you know, raised multiple bio funds,
00:05:28
but even the first one is just not doing as well as the rest of
00:05:32
these. But I will say the big asterisk
00:05:34
and the big saving grace for these later funds for Andreessen
00:05:38
is their crypto funds. The crypto funds, which began in
00:05:41
2018, have just done just so much better than all of these
00:05:46
other recent funds, and they show cash back to LP's faster
00:05:50
than the typical venture cycle. Flipping tokens: you
00:05:55
invest equity in, you know, the businesses, but then you
00:05:58
also get these tokens and you can sell them.
00:06:00
I will also say, in our first and recent story, which you should
00:06:02
also subscribe to the newsletter to read, we had that they're making
00:06:06
this staking revenue, which I only half understand.
00:06:08
I'm sure some crypto die hard will listen to this and be like,
00:06:11
duh, staking. But you know, they're they have
00:06:14
money riding in these crypto projects and they they throw off
00:06:16
millions of dollars of cash as an investment firm, just, you
00:06:19
know, holding tokens and getting staking income.
00:06:22
So they're, yeah, they're making a lot on crypto and it's it's
00:06:26
been a boon for the firm. As an LP, though, how excited are you
00:06:31
to be like basically getting paid back via crypto like token
00:06:35
flipping? I guess like money is money and
00:06:37
if you're getting what you want back, you're fine to a degree,
00:06:40
But it doesn't seem like you're really fully indexed like across
00:06:42
the tech, you know, ecosystem the way you might want to be if
00:06:45
you're just getting like, you know, basically coin flipped.
00:06:48
You couldn't get more indexed than Andreessen Horowitz.
00:06:51
I don't know. Reading this,
00:06:54
you know, if I was an LP, would I deploy in Andreessen?
00:06:58
I wouldn't want them to bully me into bio.
00:07:00
I guess that would be. But yeah, I think you get a
00:07:03
broad coverage of venture capital and private technology
00:07:08
for a fund that has proven that they can get sort of top access
00:07:12
and has top performance. And then you just worry that
00:07:16
this, I think you have to ask this sort of question, which
00:07:19
is: has the technology business gotten so much bigger to justify
00:07:22
these much larger funds? And like, that's still playing
00:07:25
out. And that's the question.
00:07:27
You know, firms like Coatue and Tiger Global, you know, they
00:07:31
raised enormous funds in sort of the pandemic, everything's
00:07:36
digital frenzy of 2021, late 2020.
00:07:41
And those funds haven't done very well.
00:07:44
And so it seemed like, oh, we didn't need all that money.
00:07:46
Andreessen has sort of kept growing.
00:07:50
And so I think there's this question about sort of large
00:07:53
private investing. But I do think on a relative
00:07:56
basis, Andreessen has done pretty well.
00:07:59
Right. It's too early to.
00:08:00
I was going to say, yeah, I agree.
00:08:01
I think for the early funds, I certainly would have wanted to
00:08:04
invest in those funds. But the big asterisk and the
00:08:07
big test is that it's too soon to have any measurable DPI or
00:08:12
cash back from these post 2020 funds.
00:08:15
And that's when everything ballooned.
00:08:16
So to your point, these mega funds, you know, off of the
00:08:20
success of these early deals, they're raising more and more
00:08:22
capital and they've raised, you know, they closed like a multi
00:08:25
billion dollar crypto fund. This first crypto fund that
00:08:27
returned super well was in the hundreds of millions.
00:08:30
So, at this scale, whether they can replicate these impressive, even
00:08:33
token-flipping returns, you know, remains to be
00:08:37
seen. And does that mean you'd go buy
00:08:40
in now? I don't know.
00:08:41
But they are in every major tech deal.
00:08:43
They do have the access which is part of the buy in right?
00:08:45
And I, I do think there's value if I'm a limited partner to, you
00:08:50
know, getting visibility. Like, Databricks becomes a key
00:08:54
company for Andreessen Horowitz. If I'm a limited partner in
00:08:57
Andreessen, they keep backing up the truck into Databricks.
00:09:01
Maybe one day I say, oh, I should, you know, if I'm a super
00:09:04
wealthy, you know, sovereign wealth fund or if I'm a
00:09:07
university, I say, oh, we'd like to jump in and do our own, you
00:09:11
know, investment following on. So, so you get visibility into
00:09:15
what's coming up. And then you can try, if you're
00:09:17
savvy, to sort of jump into winners yourself with the intel
00:09:21
you get. So there are non-monetary reasons
00:09:25
to invest. But but yeah, I think, you know,
00:09:28
the days of... I mean, Andreessen Horowitz comes on the
00:09:31
scene in 2009 really loud and says like, you know, tech is
00:09:36
much bigger, this whole world's going to get much bigger.
00:09:38
And there were a bunch of the reporter class that were worried
00:09:41
like bubble, bubble, bubble. And I think they were right.
00:09:44
Like 2012 to 2016 was a great time to invest, funds did well,
00:09:49
there is money to be made. And like a lot of the bubble
00:09:53
concerns were misguided. And so I think, yeah, looking
00:09:58
back on that period of worry on Twitter and, you know, just
00:10:02
among sort of the chattering classes and tech observers, I
00:10:05
think Andreessen was pretty right and the naysayers were
00:10:10
were wrong. I don't know, Tom, as
00:10:13
somebody who sort of watched the discourse, what's your
00:10:16
take on that? Give us the discourse, yeah.
00:10:18
Look, I, I think Andreessen, I think loud is the right word to
00:10:21
use, right? I mean, they came onto the scene
00:10:23
and, and this was Marc Andreessen's software is eating
00:10:25
the world declaration, which did prove for a time to be
00:10:30
incredibly accurate. And sure, I mean, we talked in a
00:10:33
previous episode about some of their key investments that
00:10:35
worked out for them. There was Airbnb, a lot of the
00:10:38
partners, by the way, who were behind these big investments are
00:10:40
not there or kind of not active with Andreessen, which I
00:10:43
think is worth pointing out. To me,
00:10:46
what matters now, it's like, we're talking about, like you're
00:10:49
saying, funds that are over a decade old now.
00:10:51
So, so, and this just sort of is where my mind is going
00:10:55
constantly with all of these funds right now with AI is like
00:10:58
how strong is your portfolio within AI?
00:11:02
And if you look at, you know, the spectrum of their
00:11:04
investments that they have right now, correct me if I'm wrong
00:11:06
here, but the banner ones that I see from Andreessen right now is
00:11:09
they are in OpenAI. So they do have some, they do
00:11:12
have some, some equity there, whatever.
00:11:13
But they weren't like a core early.
00:11:15
They weren't like so right about OpenAI, and in fact their
00:11:17
crypto obsession blinded them a little I think to some of the AI
00:11:21
stuff like they pivoted loudly, but I think they there's a
00:11:24
little like, you know, we're decentralized and AI is
00:11:28
centralized, and there, so there's this, if you're a close
00:12:31
watcher, there is a little hesitation I think from the firm
00:11:33
before. They were late, but they've
00:11:35
bought up a pretty significant holding at this point.
00:11:39
They can make a strong case to their LP's that they have
00:11:42
exposure to to open AI. That should make them feel
00:11:44
pretty good. Right, and they're in, I think
00:11:47
they're big in xAI, they're huge in Thinking Machines, like, and
00:11:50
they're in, you know, a lot of, I think like Replit, they're in
00:11:53
Replit, you know, which is coding application type company
00:11:56
that will speak at Cerebral Valley.
00:11:57
But you know, they were in that, you know, several pivots or
00:12:01
several moods ago for, for the company.
00:12:04
So they, they have a, you know, portfolio of very recognizable
00:12:08
names that are in the Silicon Valley zeitgeist where you look
00:12:11
at, you know, some of these other firms, you know, if no,
00:12:15
no, I mean Sequoia. I'm, I'm more saying like
00:12:17
General Catalyst and Lightspeed. Like if you look through their
00:12:19
decks, which we have also published in Newcomer, the AI
00:12:23
names are a bit more random. And I think Andreessen's AI
00:12:27
investments are names that you would know partially because
00:12:31
Andreessen does a good job of making sure we hear them.
00:12:33
We do know them. I actually think Lightspeed is OK.
00:12:35
They're a little, there's a little bit of randomness with
00:12:38
them, but they're, you know, they're big in, they're in
00:12:41
Anthropic. They, they led some previous
00:12:42
rounds there so they can. And they're and they're like
00:12:45
their business. They're good at like real
00:12:46
business. So often it's like, Oh yeah, we
00:12:48
haven't heard of them because they're wonky going to make
00:12:50
money. Whereas the Andreessen companies
00:12:52
are a little bit more like in in the zeitgeist, I think.
00:12:54
Yeah. Lightspeed I think is in like
00:12:55
Glean and a couple of these other, like, enterprise
00:12:58
tools that are like doing quite well, yeah.
00:13:00
Actually, Eric, I'm interested in your thoughts here because
00:13:03
you have like a good kind of, you know, higher altitude view
00:13:06
of maybe where VC is trending. But do you think in some ways
00:13:10
Thrive has kind of taken the mantle of like the loudest, most
00:13:13
banner company oriented firm where it seems like they
00:13:16
basically go very heavily into things like OpenAI and in a
00:13:20
much more public way and probably more money than than
00:13:23
Andreessen has. They're in Stripe, obviously.
00:13:26
They've just done a good job of grabbing...
00:13:27
Thrive's crushing it. I mean, I think they
00:13:31
are in some ways like the Benchmark of today.
00:13:35
They're not like Benchmark in that they have tons of money,
00:13:38
but like Benchmark in that they're small.
00:13:41
You get to work with the actual people.
00:13:43
There aren't a lot of like Andreessen is like who are all
00:13:45
these people? Like your names aren't on the
00:13:46
firm. Like I feel like Thrive is
00:13:48
small. You're working with somebody.
00:13:50
There's a sense that like you're working with the firm, like
00:13:52
you've got at it and like, you know, that, you know, Josh looms
00:13:55
super large there and Thrive has been able to pull in money from
00:13:59
everywhere. Yeah.
00:14:00
So and you know, I sort of teased them in one of the
00:14:03
newsletters that they were like the top tick investor of the
00:14:06
year. I forget what year that was, but
00:14:08
they've been, you know, in some random deals, but like
00:14:10
ClickHouse, and there's just a lot of things where they were
00:14:13
saying, oh, this stuff is going up and AI has gone up.
00:14:15
You know, they they were bullish correctly.
00:14:18
And again, I think a lot of being in VC is being bullish,
00:14:23
you know, at the right time, you know?
00:14:26
They've also kind of taken the mantle of being like the string
00:14:29
pullers within tech, too. I mean, I don't know how much
00:14:31
you want to go into it on this episode.
00:14:33
And, you know, we've also been out.
00:14:35
We should tell our listeners that Eric's been gone for a
00:14:36
while on paternity leave. So what he knows about, you
00:14:40
know, anything happening right now is time he's taking away
00:14:42
from his new daughter. But you know, Thrive got this
00:14:47
very gauzy profile published in a magazine that I had not
00:14:52
heard of, Colossus, which I, I don't really know exactly what it is.
00:14:55
Seems like some sort of a industry trade magazine that
00:14:58
publishes puff pieces. It's very much on my media
00:15:01
entrepreneur radar. I mean, Patrick O'Shaughnessy
00:15:04
has a great podcast and he's built out a sort of media
00:15:07
company. Yeah.
00:15:09
Yeah. The article is very glossy,
00:15:11
though. Yeah.
00:15:11
Yeah, and, and, and, you know, I, I encourage people to try to
00:15:14
read the whole thing because I couldn't do it.
00:15:17
But, you know, the fact that this was, you know, and, and
00:15:20
they, they're very difficult to to profile as, as a fund and,
00:15:23
and they're, you know... They're definitely precious.
00:15:26
They're very precious, yeah. They they don't give a lot of
00:15:29
media openings, if you will. Yeah, and but to be clear, I can
00:15:33
say this on the show, I like a lot of the people there and and
00:15:35
when I've talked to them, they're they're very friendly.
00:15:37
It's not like they're a mysterious cabal of people.
00:15:39
They just are pick and choosey about their publicity.
00:15:43
But they kind of by doing this Colossus profile, they sort of
00:15:47
did the Andreessen thing, you know, like going direct, like
00:15:49
they kind of actualized it. Like whereas Andreessen spent so
00:15:51
much time building up a media entity.
00:15:53
I understand that was something that they owned.
00:15:55
But all the attention and momentum has headed, like we
00:15:58
say, toward Thrive right now. And they're also getting really
00:16:01
nice articles written about at least Josh and the fund.
00:16:05
And to me, it's like you, you see them owning the moment so
00:16:08
much better than almost any other fund right now.
00:16:11
And I wonder where somebody...
00:16:12
Andreessen, to me, still has done the best on media.
00:16:15
Like, I mean, they're a machine. They're huge.
00:16:17
They're sort of synonymous with technology and venture capital.
00:16:23
It'll, it'll be interesting. I mean, you know, they're Mark
00:16:27
and Ben are still very active. So they have that going for
00:16:29
them. They have a podcast, they're out
00:16:31
there. You said earlier, like some of
00:16:33
the people who've done the key and recent investments you were
00:16:35
talking about like Jeff Jordan with Airbnb.
00:16:38
But I think for the most part, a lot of them are there.
00:16:40
I mean, Chris Dixon is the crypto guy.
00:16:42
He did Coinbase. That was key. Like, Ben did
00:16:46
Databricks. Like, I don't know, a lot.
00:16:48
Some of the key investments they are still around the hoop, but
00:16:52
But Thrive or Andreessen, yeah, they're both, they're both strong.
00:16:55
I'm not I'm not I don't think this like Colossus profile of
00:16:59
Josh, it like moves the needle that
00:17:02
much. I don't know I I think it's good
00:17:04
like Colossus like I think it's cool that they like prioritize
00:17:07
writing and like people can be pro entrepreneur.
00:17:10
I think we're in this era where it's like all the journalistic
00:17:14
defensiveness about like a sort of captured media.
00:17:18
I don't know, it's everywhere. Yeah, we talked about that a lot
00:17:22
last week with Mike Solana. I, I guess the issue with
00:17:26
Colossus was less the profile itself, which, like, you
00:17:29
know, your mileage may vary on how much you like that style of
00:17:32
writing, but it was more like the declaration that like you
00:17:34
don't need media anymore and you know, there was a.
00:17:39
But I just think there's a sense among some of the tech set, and I
00:17:42
think Arena, which is another one of these publications.
00:17:46
They're like, oh, the great thing about a magazine is it's
00:17:48
like it's glossy. It's like somebody smart was
00:17:52
paid to write something and then your picture is there and that's
00:17:55
that's it. Like, you know, it's like the
00:17:57
text is like, well, well composed.
00:17:59
I don't I don't know. I feel like they miss the fact
00:18:02
that it's like, that somebody truly independent, somebody who
00:18:05
is hard to persuade was persuaded that you're like cool
00:18:09
and good. That's that's to me what what
00:18:12
matters in a lot of these media, like marketing media sort of
00:18:15
entities. They, they give you the sheen.
00:18:18
They're like, oh, it's the good writing.
00:18:19
It's glossy, like you're great because somebody who's totally
00:18:22
captured and is destined to say positive things about you said
00:18:25
something positive. Like I, I just don't think
00:18:27
that's going to have people know who's like independent and whose
00:18:31
voice is like, oh, wow, you had, I mean, you know, that's
00:18:33
certainly what I'm trying to create here, which is like,
00:18:37
we're, we say what we think, we criticize people when we're
00:18:40
persuaded. You know, Andreessen and I,
00:18:41
we've fought about tons of stuff.
00:18:42
They freeze me out all the time. Like, but I went and we say, oh,
00:18:46
the performance looks pretty good.
00:18:47
Like the, the idea is that should carry some weight because
00:18:50
it's like, there's this whole history.
00:18:51
I have plenty of reasons just to needle them, you know what I
00:18:54
mean? Whereas when you're getting this
00:18:56
glowing profile from sort of an entity that's interested in
00:19:01
writing positive things about entrepreneurs and investors, how
00:19:06
much weight does that carry? Yeah, you're talking, I mean,
00:19:08
it's kind of style over substance, right?
00:19:10
Like the depth of where this is coming from.
00:19:14
It has all the trappings of, you know, a glowing magazine profile
00:19:17
with none of the actual gravitas behind that of, you know, the
00:19:21
solid reporting backing up the reputation. And none of the, oh,
00:19:24
this could have been a really negative piece and like they
00:19:27
were they like them, you know, you need a little bit of that.
00:19:30
It could have gone in different ways, right?
00:19:32
Right. Yeah, especially with, you know,
00:19:34
a successful firm or and a rich person who's married to a
00:19:37
supermodel. Like, you do kind of need to
00:19:39
overcome the inherent skepticism of a reader to make a better
00:19:43
case as to why you care about them.
00:19:45
If it's an audience that's already like, man, these guys
00:19:47
are great and look how great they are.
00:19:49
They they're telling me, I will say journalistically as someone
00:19:51
who covered, because one of the elements of the story of Thrive
00:19:54
was the role that Josh Kushner played in, you know, Sam
00:19:57
Altman's ouster from OpenAI, exactly how much string pulling
00:20:01
he did to get him back on board. I had some qualms with the way
00:20:05
it was described and I understand everyone is the hero
00:20:07
of their own narrative and right, you know, even though
00:20:11
it's. Funny that it's the same moment
00:20:12
Jared Kushner is clearly trying to get tons of credit for this
00:20:15
Israel deal. Both the Kushners are, like,
00:20:18
savvy in that Jared's getting in the New York Times and places.
00:20:21
But both are clearly like moments of crisis.
00:20:24
We're we're here. Yes, you could really trace the
00:20:26
Israel Gaza deal back to the time in which Josh Kushner
00:20:29
personally got Sam reinstalled
00:20:32
as the head of OpenAI, and you're like, well, I got
00:20:34
to top this somehow. What's what's on my list?
00:20:37
Oh, wait. Peace in the Middle East.
00:20:39
Yeah, yeah, TBD on that one, by the way.
00:20:42
But anyway, look, I, I by the way, the ratings apparently go
00:20:46
down on the podcast when we talk too much about media.
00:20:48
So I don't want to. I don't want to.
00:20:49
Dwell too much on this one. I did it.
00:20:51
I'm sorry. It's my fault.
00:20:52
We can't help ourselves. It's navel gazing.
00:20:55
But but Andreessen look, I think the AI story is is still is
00:20:58
still like everyone else is still to be told they have some,
00:21:01
you know, exposure to a lot of top companies.
00:21:04
I'm interested to see where the xAI one in particular will go.
00:21:08
But you know, they lost Igor Babushkin, who was
00:21:12
like one of their key executives.
00:21:13
We're having one of their co-founders speak at Cerebral
00:17:17
Valley on November 12th coming up, so we'll, we'll be excited
00:17:20
to hear how things are going with xAI on stage.
00:21:25
But yeah, it's a, it's a company that I, you know, I think people
00:21:30
had underestimated and I was like.
00:21:33
Did I say it? Big things are coming, and they
00:21:36
they caught up, but then I feel like, you know, we need to see
00:21:40
like revenue and stuff like that come out of the company.
00:21:43
Yeah, anyway, we'd love to do more stories on other VC funds
00:21:47
returns, TVPI and DPI. Please keep sending them our way.
00:21:51
And subscribe. And, you know, our
00:21:53
readers and our listeners, you guys are in the business.
00:21:55
You know as much about this as we do.
00:21:58
Please send me, you know... well, your commentary is anonymous.
00:22:01
You know, if you say off the record, I'll take it off the
00:22:03
record. Please send me your observations
00:22:06
on where you think the
00:22:09
returns are strong and weak. Would be eager to hear from you.
00:22:12
We're just, we're all at newcomer.co, Eric.
00:22:15
Yeah, we all have takes. You know, yeah, feel free to
00:22:18
weigh in. Speaking of takes, let's check
00:22:22
in on the state of hype within AI and where we're at in the
00:22:25
cycle right now. It's been an interesting couple
00:22:29
of weeks on that front. So we had Andrej Karpathy
00:22:33
appearing on the Dwarkesh podcast giving his, you know,
00:22:38
fairly clear eyed. I wouldn't say skeptical or
00:22:43
negative takes, but it definitely got a lot of people
00:22:45
in the industry trying to either defend AI or claiming that they
00:22:49
knew all this whole time that it was overhyped and not quite what it
00:22:53
was promised to be. There's an interesting piece also
00:22:55
in the New York Times about Amazon and their plans to
00:22:58
automate a lot of their back of house processes, which is of
00:23:03
course, you know, robotics, but but AI as well.
00:23:06
And then a lot of stuff going on with open AI.
00:23:09
And I mean, they're just a non-stop hype train over there
00:23:12
with. There's a browser claiming...
00:23:14
Have you tried any kind of browser? They're, they're
00:23:18
claiming that they're solving unsolvable math problems.
00:23:20
It's basically just, you know, Matt Damon in, and, and Good Will
00:23:27
Hunting, but... So isn't Matt Damon good at math
00:23:30
in Good Will Hunting? Yeah, well, so was.
00:23:30
So was the claim of the OpenAI people.
00:23:32
Yeah, no, I think it is. Well, I think the Karpathy thing
00:23:35
kind of summed up his appearance on Dwarkesh and his statements
00:23:38
and kind of everyone's, you know, having a take about them
00:23:42
from both the hype and the skepticism.
00:23:45
It's not going far enough. It's sort of like if you give a
00:23:47
nuanced state of play in the industry to someone, everyone is
00:23:51
going to have a problem with it depending on where they're
00:23:53
coming from. Because he basically was saying,
00:23:56
I think, you know, that based on his understanding at the
00:23:59
technological level, things are very impressive.
00:24:01
We still have a lot to go, bullish on AI overall, but AGI
00:24:06
is not going to show up in 2027 like Dario and some other
00:24:10
founders of the other big AI labs have said.
00:24:13
And I think, you know, that level of nuance was certainly
00:24:17
optimistic. He wasn't pessimistic.
00:24:19
I think it was framed everywhere like he, you know, is shitting
00:24:21
on the future of AI and thinks it's all wrong.
00:24:23
But... When Andrej Karpathy speaks, the
00:24:26
AI world listens. You know, this is the guy who
00:24:29
came up with the term vibe coding.
00:24:32
I think he's very intelligent and very sort of plain spoken
00:24:37
and direct. He's not someone who puts all
00:24:40
these sort of false linguistic complications into his speech.
00:24:43
He sort of says what he's thinking.
00:24:45
He's trying to speak directly. And so I think just his style of
00:24:49
communication, his obvious intelligence and how close he is
00:24:53
to the development of this technology, people take what he
00:24:55
has to say very seriously. And he's not.
00:24:59
You know, we have people like Dario at Anthropic who, you
00:25:03
know, I like he's spoken at Cerebral Valley.
00:25:06
There's a lot to like about Anthropic but has made Santa.
00:25:09
Roman politically too. I mean, he's like, yes.
00:25:11
Yes, exactly. He's fighting with David Sacks.
00:25:14
Everyone who fights with David Sacks, you know, enemy to my
00:25:17
enemy. But, you know, Dario has made
00:25:20
some pretty grandiose predictions about the
00:25:25
rate of AI improvement that I think at this point are not
00:25:28
coming true at that rate.
00:25:30
And Sam Altman, obviously at OpenAI, you know, sometimes
00:25:34
he's saying we're overvalued and sometimes not, but he always
00:25:38
talks in such big terms that it feels hypey.
00:25:41
Even when he's not being hypey, it feels like it's big.
00:25:44
We need trillions of dollars. You know, we need more money
00:25:47
than has ever been raised, sort of thing.
00:25:49
And so I think, you know, this Karpathy interview had a little
00:25:52
bit more realism, which is a stance that, you know, we've,
00:25:57
we've taken in Cerebral Valley, which is just sort of like it's
00:26:00
clear if you really work with, you know, ChatGPT, Claude, the
00:26:04
models. There's a lot of capability
00:26:06
there that, you know, if it had a little more guardrails, if it
00:26:10
was handed to the consumer in a more prepackaged way.
00:26:13
It seems, you know, we've seen that with coding, with all these
00:26:16
coding applications, if if you do that work, you connect it to
00:26:19
data, there's a lot of value to be had there.
00:26:22
And so I think, you know, the Andrej point of view was like,
00:26:26
there's still all this value to be had connecting what the
00:26:28
models can do right now to, you know, all this technology and
00:26:32
all this data, but this sort of sense that they're gonna get so
00:26:36
much smarter. We're gonna have AGI tomorrow.
00:26:39
He was pushing back on that. I think the point that he
00:26:42
really landed on here was in regards to agents, right?
00:26:45
There's tons of venture capital hype around agents.
00:26:48
Agents are being pitched everywhere as the next big
00:26:50
thing. He basically said the industry
00:26:53
is kind of overshooting the tooling with its present
00:26:56
capability, which anyone who's kind of used these agents that
00:26:59
are like claiming to be travel agents and book everything and
00:27:01
do multi-steps for you knew already.
00:27:03
And so it's kind of refreshing to have this very serious voice
00:27:07
of technical leadership in AI speak to you honestly and say,
00:27:11
yes, the agents are not quite there.
00:27:13
That does not mean that AI is, sorry to the skeptics, all not
00:27:17
going to work and won't ever get there.
00:27:19
It's just the timelines we need to be a little more realistic on
00:27:22
and we need to explore different technical angles to make these
00:27:24
things work. And maybe we haven't figured
00:27:26
those out yet. Well, the rise of agents was
00:27:29
something super interesting to me when I was covering
00:27:31
enterprise businesses at the Wall Street
00:27:33
Journal, because you basically saw the trajectory of AI go
00:27:37
from, you know, ChatGPT comes out.
00:27:39
It's a consumer phenomenon. Microsoft built their own search
00:27:42
engine and then basically Copilot was the term at that
00:27:45
period. These were assistants that sit
00:27:47
alongside you and can make you more efficient and can make your
00:27:50
job that much easier to do or help you write e-mail, shit like
00:27:52
that. And then with no major, in my
00:27:55
opinion, technical breakthroughs after the Copilot era, agents
00:27:59
just sort of appeared on the scene.
00:28:01
And suddenly these things that were like debatable as to how
00:28:04
good they really could work became even more essential and
00:28:08
autonomous in what they were supposed to do for you.
00:28:11
And the technology didn't seem like it took, you know, a major
00:28:13
leap. And so I was kind of scratching
00:28:16
my head this whole time seeing Microsoft specifically go from
00:28:20
like, oh, we have this copilot tool and everyone should pay
00:28:23
extra for Microsoft 365 so you can use their copilot too.
00:28:26
Oh yeah. Now you're supposed to use
00:28:27
agents where it does all the shit for you.
00:28:29
You know, obviously you want to show that you're an AI
00:28:31
forward company, because then you get a premium on your stock.
00:28:34
And so you have to make all kinds of statements when it
00:28:36
comes to utility of this shit. Like they were just starting to
00:28:39
figure out copilots, and so now you're gonna start claiming
00:28:43
there's agents. I just updated my Apple
00:28:45
operating system, right? And if there's any company
00:28:47
that's trying to force this AI thing without seeming to have
00:28:50
a view of what actual value it drives, it's Apple, right?
00:28:54
They promised all these phones based on Apple Intelligence, and
00:28:58
they haven't really figured out what the value is.
00:29:01
And it's like, you know, they they mess with your contacts and
00:29:04
they say, oh, here's who we think.
00:29:05
But it's like, I had a nice favorite list.
00:29:07
It was organized. How is your, like, smaller, weirder version
00:29:11
better than, like, my nicely organized one?
00:29:13
And so, yeah, a lot of these companies have an incentive
00:29:18
to force an AI narrative. I mean, Tom, you just wrote
00:29:22
about Salesforce, which, you know, Marc Benioff has been like
00:29:24
agent force, agent force, agent force.
00:29:26
I mean, he's been the most sort of egregious in terms of just
00:29:30
tying, you know, the name of his company to AI.
00:29:32
You know, he's a sales guy, you know, he knows how to
00:29:36
try to associate, but that sort of heavy-handed approach
00:29:40
hasn't resulted in a good stock performance.
00:29:43
No, certainly not because they're stuck in this position
00:29:46
where, you know, you see the rise of these agents or AI
00:29:49
companies. And it's like either someone
00:29:51
like a Salesforce is going to be able to, you know, leverage
00:29:54
off of that and be able to sell these tools to the customers.
00:29:57
Or there's going to be a whole new wave of tools built by not
00:29:59
Salesforce that are just going to replace all of SaaS.
00:30:02
And you've seen there's been like a negative feeling about
00:30:05
SaaS for like the last year or so because of that worry.
00:30:09
But the knock on Salesforce is that their agents don't work
00:30:13
right. It's just like you're selling a
00:30:14
product to people that has no real utility.
00:30:16
It gets stuff wrong all the time.
00:30:18
And I think that's what Karpathy was really trying to
00:30:21
get at in his comments was like, look, there is a world in which
00:30:25
agents have incredible impact on the economy and the way we do
00:30:29
work and jobs and the labor grid and all of these
00:30:33
things, but we're not there yet. And if you're selling it to
00:30:35
people now with the claim that it's, you know, one tick away
00:30:39
from AGI and one tick away from being able to replace a huge
00:30:42
percentage of your workforce, you're bound to be disappointed.
00:30:44
It's going to fuck everything up.
00:30:46
And I, I don't know. You know what's interesting to
00:30:50
me with his comments is like how much of the hype around AI and
00:30:54
the amount of money that Sam is consistently able to raise or
00:30:57
stock jumps that he's able to generate through announcements
00:31:01
are built on the premise that AGI is just around the corner.
00:31:04
Like how much value is there in just a good utility for the next
00:31:08
5-10 years if AGI just doesn't exist on the timeline that
00:31:12
people like Dario have been claiming?
00:31:14
I know we still have some of those
00:31:16
Ed Zitron fanboys hanging around just being like, what is this
00:31:19
podcast? I still, you know, think the models themselves are
00:31:24
so valuable. I talk to ChatGPT all day
00:31:26
long like is it not like a key part of your lives?
00:31:30
Like I feel like it's extremely valuable on day-to-day questions
00:31:34
relative to like Google. I don't know, Yeah.
00:31:38
You guys don't agree or? No, I use it.
00:31:41
I use it quite a bit. I've used it for helping me
00:31:44
figure out like recipes and stuff.
00:31:46
And it's a much better search tool for a lot of things.
00:31:48
But it's not even search because it does generate for you what
00:31:51
you need, you know, and provide things for you.
00:31:53
And it helps me, you know, proofread and things like that.
00:31:56
I found it to be an incredibly useful tool.
00:31:58
But I guess I do agree with what I think Karpathy was getting at,
00:32:03
which I think is where we land here. He was landing on
00:32:06
how these can be great assistants and tools and this is
00:32:10
an evolution and they are better.
00:32:12
But the problem is, you want it to make fewer—
00:32:15
He actually wrote this in his post.
00:32:17
I want it to make fewer assumptions and ask, collaborate
00:32:20
with me when not sure about something.
00:32:22
I want to learn along the way and become a better programmer,
00:32:25
not just get served mountains of code that I'm told works.
00:32:28
And I was like, that's kind of the trough we're kind of
00:32:30
digging out of right now. But.
00:32:32
But that doesn't mean at all that these aren't useful tools.
00:32:34
Right. What do you want?
00:32:36
What do you guys make of Dario and his claims, 'cause he
00:32:40
doesn't have an amazing track record right now of like
00:32:43
timelines of when stuff is going to happen.
00:32:45
And the thing that I keep thinking about was his claim, I
00:32:48
think a year ago that basically at this time now more than 50%
00:32:52
of code would be generated by AI.
00:32:53
And I don't know what the actual number is, but everyone was
00:32:56
dunking on that because we're clearly not even close to that
00:32:59
right now. And it reminds me of yeah, yeah,
00:33:02
and it reminds me of, you know, the, the, the self driving era
00:33:07
where there were also outlandish claims about the timeline it
00:33:13
would take for this technology to be street-ready, and
00:33:17
they were way off. And I was super bearish about
00:33:19
that. To be clear, at the time I was
00:33:21
in Ubers in 2016, I was like, they're intervening all the
00:33:24
time. We are not about to be in Ubers
00:33:26
without. Well, Uber, Uber in particular
00:33:29
didn't have a great track record, like, their tech was bad.
00:33:32
But that's when they had the acquisition of Otto, yeah.
00:33:35
I mean, there was a period where Uber was really trying to assert
00:33:39
that it was going to be the leader in self driving.
00:33:40
And I think, I mean, you know, come at us, Zitron haters,
00:33:44
or the haters that Ed Zitron brought to the podcast.
00:33:46
But like I, you know, I think I'd like to say to them,
00:33:49
they're shooting at us in the comments.
00:33:51
It's our fucking podcast. We can talk.
00:33:53
We're supposed to talk like whatever.
00:33:55
They just want to hand over the mic to somebody else.
00:33:57
Yeah, anyway, go on. Yeah.
00:34:01
Get your piece in, Eric. Watch your back, yeah, there are still
00:34:04
comments on that episode. They.
00:34:05
Can shit on me as much as they want. We're just talking about
00:34:08
AI bullishness or not. I'm fine with the negative
00:34:10
comments. I love the comments that are
00:34:12
like, I can't believe they've left the comments open.
00:34:14
It's like, yeah, we can, we have thick enough skin that we can
00:34:16
take a little bit of criticism. What I don't get is that on
00:34:24
our own podcast, we're not supposed— like, wait, just
00:34:24
because you know that guy and he's on our podcast, we're not
00:34:26
just like handing it over to him to just say whatever he wants
00:34:29
the whole time. I.
00:34:29
Don't know, I'd like to see the tale of the tape.
00:34:32
I like Ed and we've continued. We let him talk so much.
00:34:35
Yeah, he had a lot of time. He had like the time of
00:34:38
possession was strong for Ed in that episode.
00:34:40
So I have no idea— a new objection.
00:34:42
Like if you're having like a through line argument, you need
00:34:45
to hold some. It's like we're arguing about
00:34:47
this thing and then you make points around it.
00:34:49
He would just be like, oh what about energy, you know?
00:34:51
Ohh anyway, I'm still still having flashbacks, clearly.
00:34:55
Look, anyway, I'll bring these haters back here by
00:34:57
saying, like, I think self driving cars are really
00:35:00
impressive. I think I do pay a premium in
00:35:03
San Francisco if I can take a Waymo versus an Uber.
00:35:07
And a lot of the promises, yeah, the timelines are way off, but
00:35:10
the technology did pan out, even if it's not all economical.
00:35:13
There's all kinds of issues around like, you know, how much
00:35:15
it costs to run these things. But like it proved itself at a
00:35:17
different timeline than was initially predicted.
00:35:19
And I guess the question is with with AI, you know, is there a
00:35:23
timeline that does make sense that someone like Karpathy
00:35:26
agrees with where we do reach AGI?
00:35:27
Or is this capability of, you know, these models where ChatGPT
00:35:32
is now, which does seem like it's leveled off quite a bit.
00:35:36
Is that enough to, you know, meet the, the hype of something?
00:35:40
You know, it may not be to the level of displacement, but like
00:35:43
there's a lot of utility. We're all seeing that in in
00:35:45
these tools. And like, where does valuation,
00:35:47
you know, net out with that sort of thing?
00:35:48
And I don't think he has an answer to that, but that to me
00:35:52
is the question we should be asking in the next year or two.
00:35:55
It's hysterical, though: unlike self driving cars, our whole economy
00:35:58
seems to hang on this question. Right, self driving cars were
00:36:01
more of like on the side.
00:36:03
This was a fad. And with this I do see.
00:36:06
It is. Right.
00:36:06
You know, kind of. Maybe you got a little annoyed
00:36:08
that it's like, oh man, these companies are getting profiles
00:36:12
that act like they're, you know, geniuses when, like, they can't
00:36:15
do what they say they can do. But like now I think we're
00:36:18
underrating self driving cars. But the but the yeah, the AI
00:36:22
stuff right now, I mean, just Nvidia's the most valuable
00:36:24
company, you know, in the world. It's it so much of our economy
00:36:27
hangs on continued adoption of artificial.
00:36:32
And, and specifically CapEx spending on data center growth
00:36:35
and that's like I would say 80 to 90% the responsibility of Sam
00:36:40
and striking all these deals that are basically, you know,
00:36:44
deals upon deals of data centers to be built.
00:36:47
And so like, yeah, the point I guess you're making is like,
00:36:50
yeah, it does need to be more than where it's at now.
00:36:51
Like you're not going to be building all these data centers
00:36:54
because AGI won't exist like that. Well, we'd like to
00:36:57
see another leap. You know, it's like, OK, ChatGPT
00:37:00
has been great, coding's cool, but like, I feel like our minds
00:37:05
haven't been blown, you know, a little bit right?
00:37:08
I mean, we get impatient. When's the next— the video stuff—
00:37:11
The video, I was gonna say Sora was a
00:37:13
big leap, the new Sora app, which.
00:37:17
Is still number one on the App Store, by the way. I checked.
00:37:19
Is it? OK, interesting. Yeah, I see.
00:37:22
I see the ones that make it over to TikTok, but I don't love
00:37:26
them like. Well, there was, I will say
00:37:30
this, this deepfake, this Sora video of like a pastor, you know,
00:37:34
condemning billionaires went pretty viral over this week that
00:37:39
a lot of people fell for, including a lot of my.
00:37:41
Friends, yeah. Like, they are getting realistic.
00:37:43
Like I already hated— like, I hate those.
00:37:46
People love those like skits where it's like clearly these
00:37:49
are actors, but they get mad at it in the comments.
00:37:52
You know what I'm talking about where it's like, yeah, it's like
00:37:54
a husband and wife are fighting or something.
00:37:56
But like all the comments, it's like, you know, treat it as if
00:37:59
it was like real where they're like, this guy's a jerk or
00:38:01
whatever. You know, it's like, but these
00:38:02
are clearly actors. I feel like the Sora sort of
00:38:05
thing is like the next evolution of it.
00:38:06
I just hate the media consumer that like buys into the fiction.
00:38:11
Like they don't, they don't care like if it's real or not.
00:38:14
They're like, oh, it feels real. And so I'll just react to the
00:38:17
character. Because it's like.
00:38:18
Wouldn't it be great if you— You know this is happening.
00:38:20
There's been a TikTok meme, I don't know if
00:38:23
you've seen it, of, um, putting various historical figures in
00:38:28
the NBA and talking about how like their career is sort of
00:38:30
ending. So it's like Queen Elizabeth,
00:36:33
like, in a Celtics jersey being like, I'm just sort of old now,
00:36:36
and her, like, at the podium talking about
00:36:38
retiring. And then it cuts to— it's really
00:38:41
fucking funny. I laugh every time like it's
00:38:43
there's one with Queen Elizabeth, Winston Churchill.
00:38:48
Who else has there been anyway? But so I watched them and I was
00:38:51
like trying to show Rosa them and she's like, this is stupid.
00:38:54
Stop showing this to me. But I— and I feel kind of gross
00:38:57
after I watch them too. Like I laugh at it.
00:38:59
And then in the moment I was like, this truly is slop.
00:39:01
I mean, this is the lowest form of of entertainment.
00:39:04
And I understand these things will evolve over time in terms
00:39:07
of acceptance culturally. But as it stands now, like,
00:39:10
yeah, it does entertain me. I do laugh, but I don't feel
00:39:13
good about it. But the dream is that this
00:39:18
is like some Tom-specific video, like—
00:39:18
They've figured out some sliver of human psychology, and for
00:39:21
some reason you think this odd thing is funny.
00:39:24
And then they can target us all and we get all these weird like
00:39:27
Freudian, like, oh, I like, you know, you, you have this weird,
00:39:30
I don't know, yeah, it's just funny to me that you, you find
00:39:33
it amusing. You haven't found many other
00:39:34
people who find it funny. It's like this is this is the
00:39:37
real dream of these AI generated videos that they can micro
00:39:40
target everybody and. Right.
00:39:42
And then it panders to each of us.
00:39:45
And it's engineered to be addictive, which is why there's
00:39:47
like, you know, this existential fear in Hollywood over like, how
00:39:50
can we compete against this, right?
00:39:52
If it's all about a battle for attention, It's really hard to
00:39:55
say like original story versus thing that like is almost to the
00:39:59
precision of a casino in terms of how it gets your attention.
00:40:02
What's amazing about language models, that they
00:40:06
haven't got enough credit for, is that they were all built around
00:40:09
the truth. They were trying to be like, how
00:40:11
can we not hallucinate? How can we be as accurate as
00:40:13
possible? Unlike, you know, social
00:40:16
networks, which were built on engagement.
00:40:18
And it's like, we'll deliver whatever we can get you to stay
00:40:21
engaged. Obviously, AI is more and more
00:40:23
involved in that, but we'll use the fact that the content has
00:40:26
been created by humans as our excuse that we don't have to
00:40:30
worry about the accuracy of what these humans created, even if
00:40:32
we decided to distribute some of it and not others.
00:40:36
And now obviously, something like Sora, you're like, screw
00:40:39
it. Like let's just do something
00:40:41
based on engagement. And that, you know, that's a
00:40:44
dark path. Well, right.
00:40:46
And then the question is, and I talked to Varun Shetty, who's
00:40:50
the head of media partnerships at OpenAI, about this.
00:40:53
Like are we supposed to view something like Sora as an off
00:40:57
ramp as just sort of like some entertainment to the side while
00:41:01
you guys are pursuing your path towards AGI or is it critical
00:41:04
towards getting there? And his argument was the latter
00:41:06
is that like, actually you can gain a huge amount of
00:41:08
information about the real world through generating videos.
00:41:12
And I don't know the, the training process that it takes
00:41:15
to do that, which is controversial on its own.
00:41:18
And so at that point it's like, well, prove it.
00:41:21
Like you guys have made this viral Sora app that has gotten a
00:41:24
ton of people addicted to it, still number one on the App
00:41:27
Store. Like right now OpenAI has the
00:41:28
number one and number two slots on the App Store.
00:41:31
Like you guys should really be getting towards AGI now, like
00:41:34
'cause this is all just an off ramp towards slop and
00:41:36
entertainment and replacing of social media at the very least.
00:41:39
That's disappointing to investors and, you know, society
00:41:44
at large. I'm sure it takes a huge amount
00:41:46
of GPUs to get this stuff done. But yeah, it's like, I think
00:41:50
they need to keep reminding people.
00:41:52
And I think it's good for people like Karpathy to keep talking
00:41:55
about that is like, we know what the goal has to be.
00:41:57
Let's keep in mind what the you know what the timeline is and be
00:42:01
clear eyed about overhyping it. But like this is not about
00:42:05
creating slop. I wanted to talk about, you
00:42:08
know, the New York Times did this story on Amazon that
00:42:11
basically said, you know, Amazon is set on automating a bunch of
00:42:15
its workers. You know, Amazon, I think is the
00:42:17
second largest employer in the United States, a company that
00:42:21
really believes in robotics, already has many robots and
00:42:24
they're driving to have even more robots doing work and that
00:42:29
this is part of their grand strategy.
00:42:32
And obviously there are robots that are not built on language
00:42:35
models, but you know, the improvements with foundation
00:42:39
models and language models have improved computer vision and now
00:42:44
in some cases they're powering robots entirely.
00:42:46
And so we're seeing improvements in robots, powered in some
00:42:51
cases by AI that could mean more automation.
00:42:56
I'm curious what reaction you guys had to that story.
00:43:00
I mean, this has sort of been a long time coming for Amazon.
00:43:03
Like they have been building up their workforce.
00:43:06
It's always been this kind of flex situation where they, like,
00:43:09
over-hired during the pandemic, and then they had to lay a bunch
00:43:11
of people off and that caused a huge amount of turmoil there.
00:43:14
But there's no question that long term, this business wants
00:43:18
technology that allows, you know, robots to be able to put stuff
00:43:23
on and off shelves or whatever. That's where it was going to go.
00:43:26
So I was kind of surprised to see people mad about it.
00:43:28
I understand, like, yeah, people losing their jobs is an
00:43:31
upsetting prospect. But certainly given that it's,
00:43:34
you know, to your point, probably one of the largest
00:43:37
employers in the US. But I guess I agree with you,
00:43:40
Tom, I was kind of surprised to see people kind of freak out
00:43:45
about this story in that it seems like that's definitely was
00:43:49
the goal the whole time. I don't know, it's as if it
00:43:52
was something new, you know, as if it was something new that
00:43:54
automation is where this was going.
00:43:55
It's just smarter automation, frankly, for
00:43:59
a lot of it. I just hated how ominous the
00:43:59
story was. Like, I don't, yeah.
00:44:02
I mean, Amazon's going to try and do things as cost
00:44:05
effectively as possible. What?
00:44:07
What? What else?
00:44:09
Yeah, I mean, there, you know, and this one is more realistic
00:44:12
than, like, drone delivery or other things that, you know,
00:44:14
Amazon was more public about. But it's weird to see.
00:44:19
You know, I remember, like, height of the pandemic, everyone
00:44:21
was looking at the way that Amazon was hiring all these
00:44:24
people. And remember that movie Nomadland
00:44:26
came out where you had Frances McDormand's character
00:44:28
working temporarily at, like, an Amazon warehouse, which was done in
00:44:32
partnership with Amazon. So it wasn't an anti-Amazon
00:44:35
thing. But we were all kind of
00:44:36
wondering like, Oh my God, is like that going to be the new
00:44:39
blue collar job for everyone across the country?
00:44:41
And I, it's weird to see people looking at a job that previously
00:44:47
we thought was like beneath the dignity of most human beings.
00:44:51
And, you know, they're anti union.
00:44:52
And that really pissed off a lot of people.
00:44:54
The working conditions are really terrible.
00:44:56
People were like, you know, all the exposés about people peeing
00:44:59
in bottles to get the delivery quotas met, that kind of stuff.
00:45:02
A lot of outrage about the jobs before.
00:45:05
Yeah, and you know, Amazon did respond to that stuff and started
00:45:08
paying people more, you know, out of public pressure from
00:45:10
people like Bernie Sanders. And remember, AOC
00:45:13
successfully got Amazon to not build HQ2 in and around New
00:45:17
York, which kind of was an early win for her in the left wing
00:45:20
movement. So I, yeah, I don't know, like
00:45:25
this is just the direction of technology and has been for like
00:45:28
dozens of years. They've automated huge portions
00:45:31
of assembly lines in car manufacturing and stuff too.
00:45:35
I mean, this is not like skilled labor in most respects.
00:45:39
This is putting stuff in boxes. And I'd love there to be a world
00:45:42
where people that had those jobs had more intellectually
00:45:45
inspiring things to do or there's a UBI that gets people
00:45:48
to do other things. I mean, that to me seems a more
00:45:51
reasonable conversation than worrying about, right?
00:45:55
Companies should automate boring, dumb stuff nobody wants
00:45:59
to do, and we should elect politicians who create a safety
00:46:02
net so that every American has healthcare and nobody starves.
00:46:05
But we shouldn't handicap great American companies and say they
00:46:09
can't use current technology to, you know, get me my, I don't
00:46:14
know, battery packs here slightly faster or whatever.
00:46:18
I ordered from Amazon, you know, in the last day or two.
00:46:22
Yeah, I don't know. And true leftists, I think,
00:46:24
understand that as well. I mean.
00:46:25
Do they, I don't know, I just feel like what annoys me about
00:46:28
this kind of story is the, you know, the haters from both
00:46:31
sides. The haters are like AI sucks and
00:46:34
AI is killing all the jobs. It's like, which is it?
00:46:36
That's that's what drives me crazy.
00:46:37
This sort of like just ominous people who feel like they're
00:46:41
driven by like ominous vibes and just like depressives reacting
00:46:45
to everything with that as their sort of lens, and then find a way
00:46:48
to be negative about everything. That that's what drives me crazy
00:46:53
just socially and interpersonally.
00:46:54
I realize it's a small concern, but that's why these, I don't
00:46:58
know, the The New York Times story just had that sort of like
00:47:01
negative feel. It's like what?
00:47:03
Yeah. Yeah, I understand the concern
00:47:07
and fear about it. I mean, if we are talking, I
00:47:09
mean, there are cities that were basically destroyed economically
00:47:13
because of, I mean, globalization, but also the
00:47:16
automation of a lot of jobs. Like Detroit, you know, in many ways
00:47:19
has not really recovered from many of these kind of third-
00:47:23
party assembly line businesses not, you know, needing as many
00:47:26
people on the line as they used to.
00:47:28
And, you know, I, I don't think the job of a union should
00:47:31
primarily be to stop the progress of technology.
00:47:35
I'm seeing a lot of this happening in Hollywood too,
00:47:36
right, with AI. It's like, what role does a
00:47:38
union have to play when you start to see a potential job
00:47:42
displacing technology come about?
00:47:43
And really, your job is to protect the livelihoods of the
00:47:46
people who you know are affected by it.
00:47:49
But trying to stop something from existing just doesn't.
00:47:52
It doesn't fit with, like, the march of time.
00:47:54
There's just not great success stories of stopping technology
00:47:57
from existing, so. Well, the worst ones are these
00:48:00
attempts to stop like ports from automating moving like
00:48:05
containers around which is just insane.
00:48:08
Oh, like the dock workers unions and.
00:48:10
Yeah, yeah. Trying to block, you know, the
00:48:13
automation of ports, which our entire economy rests on
00:48:17
moving these things in and out. Yeah, I feel like they should
00:48:20
also be looking more at like the types of entities that are
00:48:22
buying ports in America, which are like also fairly insane.
00:48:26
I mean, you have a huge amount of Middle Eastern money
00:48:27
essentially buying up and building new ports in the
00:48:30
country. It's the only people who are
00:48:31
investing in that sort of stuff. So I'd be more interested in
00:48:36
following the money there rather than saying we need to protect—
00:48:39
Man, I don't want to get the dock
00:48:41
workers pissed off at us and get that.
00:48:43
Really. The comment section is full of
00:48:44
stevedores. No.
00:48:45
People who— the dock workers
00:48:47
finding our podcast? No. I'm not a union diehard, I.
00:48:53
Know, I'm the leftist on this podcast too.
00:48:55
Wait, I'm the former union maiden, but.
00:48:58
Are you? Yeah.
00:48:59
Yeah. Maiden was the title.
00:49:02
Was my nickname. No, no, I was involved in the
00:49:05
Business Insider union, but I feel like the attitude there has
00:49:09
always been, you know, much more just, I guess not wanting your
00:49:13
job completely automated. But it wasn't resistant to AI
00:49:17
coming into The Newsroom at all. It was just like, how do we use
00:49:20
this strategically? Do we use this to actually write
00:49:22
copy? That's kind of the line that we
00:49:24
land on, but you know, for brainstorming.
00:49:26
And I mean, I feel like as an editorial rule, you know, I
00:49:29
think it's kind of where it's landed in a lot of other news
00:49:32
companies too, you know: not, you know, plugging your
00:50:36
story into ChatGPT and just publishing it, that not being
00:49:39
what's expected of you. Well, within journalism too, the
00:50:41
types of jobs that an AI could do or would be doing are, again,
00:50:45
not very good jobs. I mean, certainly copy editors,
00:49:49
yeah. I don't even think those people
00:49:50
were part of the BI union, but it's like.
00:49:52
They were. But your article— yeah, our human copy editor still
00:49:56
generally outperforms ChatGPT, but.
00:49:58
ChatGPT, yeah, is not quite as— doesn't catch all the pieces
00:50:02
as a copy editor. Also, it hasn't—
00:50:04
It's hard to teach it, obviously our house style and things like
00:50:07
that, but I'm sure that's solvable.
00:50:09
All right, anyway, that's our episode.
00:50:11
Thanks, Eric, for dropping on in.
00:50:14
I think... I don't know, are you going
00:50:15
to be in for a couple more episodes, or what's
00:50:16
the...? No, no, I'm back.
00:50:18
Well, next we're going to... I think Cerebral
00:50:20
Valley is going to take over the podcast.
00:50:22
Max Child and James Wilsterman will be back on.
00:50:25
We have, we love doing our draft of AI startups.
00:50:28
So if you thought this was, you know, somewhat pro-AI, we're
00:50:32
about to take AI by the fire hose for the next couple
00:50:37
episodes. So that'll be a lot of fun.
00:50:39
I mean, we're, you know, we're going to be prepping for all
00:50:41
these interviews with the CEOs of top AI companies.
00:50:45
We have execs from Anthropic, xAI, Replit, Vercel speaking, and
00:50:52
so... Wait, Vercel?
00:50:53
I'm sorry, I didn't know that.
00:50:54
Yeah, Guillermo is speaking. All right, Guillermo, we got
00:50:57
some questions for you. Interesting photo ops that I'm
00:50:59
sure will come up at some point in the conversation.
00:51:02
Just... well, we have both
00:51:04
Amjad at Replit and Guillermo at Vercel speaking, but I think
00:51:07
we're... they've feuded a little on Twitter over
00:51:10
international politics, but that is not the theme of our
00:51:14
conference. We're talking about AI, but
00:51:16
yeah, they. I don't think they're.
00:51:18
On the same panel. No, no.
00:51:20
And I don't think we're putting them next to each other.
00:51:22
This isn't. This isn't Dario and David Sacks
00:51:24
showing up at the Benioff party last week.
00:51:26
Oh, how was that? Did they interact?
00:51:29
Nope, Nope. And Dario?
00:51:30
I overheard him. Here's a blind item for you.
00:51:33
Dario kind of muttering to, I believe, his PR person.
00:51:36
Why am I here? Oh my God.
00:51:41
So we love a good pairing here in tech, but
00:51:45
specifically at the Newcomer Cerebral Valley conference.
00:51:48
Yeah, so that'll be fun. And this comes out.
00:51:53
Oh yeah, this will come out this week.
00:51:55
I should have a post in the newsletter about my child.
00:51:58
I've sort of, you know... there's an Instagram post, you
00:52:02
know, they're breadcrumbs, but I will talk about
00:52:05
having a child and I've quietly been on paternity leave.
00:52:08
I appreciate you guys stewarding the podcast in my absence.
00:52:11
It feels like you had some good episodes, and I think we keep
00:52:14
learning and improving the show and we've been punching up
00:52:17
production and so excited to yeah, pick up from where you
00:52:21
guys left off and keep growing the audience and
00:52:24
improving the show. And we love your feedback.
00:52:27
So subscribe to the newsletter, follow along, and definitely tell
00:52:31
us what's working. You know, we want
00:52:33
feedback. What do you like?
00:52:33
What don't you like? Send us an e-mail.
00:52:36
Alright, thanks everyone. See you next week.
00:52:38
Alright, thank you.
00:52:39
See you. Bye.
00:52:39
Thank you for tuning in to this week's episode of the podcast.
00:52:42
If you're new here, please like and subscribe.
00:52:44
It really helps out the channel. Listen in for new episodes every
00:52:46
week, wherever you get your podcasts.
