This week on the Newcomer Podcast, Madeline and Tom are joined by Alex Heath to dig into some of the biggest questions in tech right now.
We ask: Is the AI bubble about to burst? OpenAI is propping up huge partners like Microsoft, Oracle, and Broadcom — but what happens if their momentum slows? Meanwhile, Meta just launched Vibes, a quirky new product that seems far removed from the company’s AGI ambitions. Does this mean the AI hype cycle is already shifting? From venture capital’s bets on AI, to Meta’s surprising pivots, to the fragile foundations of the current AI boom, this episode unpacks the stakes for Big Tech, startups, and investors alike.
00:00:00
Is the AI bubble going to burst? This week, Tom Dotan wrote an
00:00:03
article for our Substack that explores the web of mutual
00:00:06
dependence between OpenAI and its partners, and how if OpenAI
00:00:10
fails, the tower of cards may fall with it.
00:00:13
We're being joined by Alex Heath, founder of the Sources
00:00:16
Substack and co-host of the Access podcast, to explore how
00:00:19
much of the entire economy is being propped up by AI.
00:00:22
We're also going to talk about OpenAI's new Sora video feed
00:00:26
and Meta's launch of their new product that they're calling
00:00:29
Vibes. Introducing Vibes, the new
00:00:31
platform from Meta that lets people with no life create,
00:00:34
share and watch short form AI videos.
00:00:37
Sucks, right? Both are hot on the heels of
00:00:39
Mark Zuckerberg's high-profile poaching and reflect the world
00:00:42
of products that feel further and further away from AGI.
00:00:46
This is the Newcomer Podcast. This podcast is supported by
00:00:59
Google. Hey folks, Steven Johnson here,
00:01:02
co-founder of NotebookLM. As an author, I've always been
00:01:05
obsessed with how software could help organize ideas and make
00:01:08
connections, so we built NotebookLM as an AI-first tool
00:01:12
for anyone trying to make sense of complex information.
00:01:16
Upload your documents and NotebookLM instantly becomes
00:01:19
your personal expert, uncovering insights and helping you
00:01:22
brainstorm. Try it at
00:01:25
notebooklm.google.com. So, Tom, you had a good story
00:01:29
out this week about how the entire economy is seemingly
00:01:33
propped up by OpenAI. Are we?
00:01:34
Are we about to, you know, hit financial apocalypse if this
00:01:38
whole thing runs out? Sell. That's my one
00:01:42
takeaway from the story. If you could do one
00:01:44
thing: we've clearly reached a peak and everyone needs to sell.
00:01:47
No, I, you know, this piece that I wrote,
00:01:51
I was really struck by a convergence of two things.
00:01:53
I had a conversation with just an analyst a few weeks ago and
00:01:57
he was mentioning to me that OpenAI has basically
00:02:01
become a systemic risk to the tech industry, to the tech
00:02:06
economy. And he wasn't even saying it as
00:02:08
a negative thing. It was more just that the number
00:02:10
of companies that have been connected to and gotten a glow
00:02:14
up, essentially, from Sam and OpenAI has grown.
00:02:18
And, you know, pairing that with the number of deals
00:02:23
that OpenAI has been announcing in the last couple of weeks, we
00:02:25
can run them down, you know, later in the conversation.
00:02:28
And the fact that like making announcements seems to be core
00:02:32
to what OpenAI has been doing recently.
00:02:34
And so the other thing that struck me was this very
00:02:39
press-savvy and, I think, a little ham-handed move by OpenAI to invite
00:02:42
all these reporters out to Abilene to kind of gawk at their
00:02:47
sprawling data center and bask in the incredible ambitions of
00:02:50
Sam Altman and the tech world that he has built around him.
00:02:54
The fact of the matter is there were a lot of reporters on the
00:02:56
ground, mingling with OpenAI
00:02:58
and Oracle and other executives there.
00:02:59
And this is all part of this big show by the company to basically
00:03:03
show off where, you know, where's all your money going and
00:03:05
why should you keep writing about us?
00:03:07
And like, this is where the money is going essentially,
00:03:12
right? It's going towards these, you
00:03:13
know, giant data centers, the current one, you know, being
00:03:17
constructed and of increasing ambition in Abilene.
00:03:20
And there were like three or four other cities that they said they
00:03:23
were going to be building data centers in.
00:03:24
And what I was trying to like get at in this story is like
00:03:29
when the numbers are getting to the point that they are where
00:03:32
it's like a $40 billion investment from SoftBank, a $300
00:03:36
billion Oracle deal, a $10 billion Broad, you know,
00:03:39
Broadcom deal. There's also the 100 billion
00:03:41
NVIDIA deal too. Right, sorry, of course, right.
00:03:44
The NVIDIA deal which was announced the day before, you
00:03:47
know, this, this Abilene trip, it's like all the stocks of
00:03:50
those companies that I mentioned that are public have gone up
00:03:54
considerably after the announcement.
00:03:55
And so it really starts to make you wonder, what does it
00:04:00
mean for a company to get its stock valuation hugely inflated
00:04:05
off of a deal that is going to be paid for with money that the
00:04:07
company doesn't currently have? Well, it's also going to be paid
00:04:10
for with debt. Yeah, a lot of it's debt, sure.
00:04:13
Certainly debt and sovereigns, and there's no
00:04:16
doubt that the... well, actually, I don't know. Where do you stand on
00:04:19
this, Alex? I mean like not that we
00:04:21
understand the full world of debt markets and you know the
00:04:24
the cash flow of various sovereigns, but like is he going
00:04:29
to keep raising this amount of money consistently to be able to
00:04:32
fund all of these deals? I mean, I think you have to, as
00:04:36
my grandmother said, pay the piper eventually.
00:04:39
So this is a giant money snowball and there has to be an
00:04:47
outcome on the other side. And yeah, what you were talking
00:04:50
about, all these companies with their stocks popping on these
00:04:52
deals that are multi year tranches based on certain
00:04:57
thresholds being reached, either a mix of equity, a lot of debt,
00:05:02
debt that has not been raised yet.
00:05:05
That's the bubble, right? That's the AI bubble.
00:05:08
And you know, I was reading your your column where you list all
00:05:11
these companies that are so intertwined with open AI.
00:05:15
You left some out, actually. It's
00:05:18
gotten to such a point where Reddit's stock went down like
00:05:22
almost 10% today, the day we're recording, because some third
00:05:27
party web traffic analyzer firm said that, like, a low single
00:05:33
digit percent of the ChatGPT links that they were getting
00:05:38
from ChatGPT have, like, disappeared, like something very
00:05:40
random, just from, like, a Similarweb or something.
00:05:44
Like, yeah, ChatGPT is citing Reddit a little less and
00:05:48
the stock goes down like 8%. Interesting.
00:05:50
It's like this. I didn't.
00:05:52
Yeah, that's. That's the bubble.
00:05:53
It's like, I mean, I think Shopify's stock went up when
00:05:57
they announced the merchant partnership with
00:06:01
ChatGPT where you can buy things in ChatGPT.
00:06:05
The froth and the excitement about this is just insane right
00:06:08
now. Right.
00:06:09
And, and the volatility, right, that something can go up and
00:06:12
down, you know, double digit percentages at times because of
00:06:15
a deal whose value isn't even really figuring into the bottom
00:06:19
line or the top line of most of these companies.
00:06:21
Reddit's an interesting one actually, because I don't know
00:06:24
this. Maybe you do.
00:06:24
But like, they signed a licensing deal.
00:06:27
Yeah. Open AI did with Reddit for
00:06:28
training. Did their stock move much on
00:06:30
that? I imagine, Yeah.
00:06:31
It did when they... well, I think they announced it
00:06:33
right before they went public. But yeah.
00:06:35
They did pop, I think to some extent, in their IPO, in part
00:06:39
because people understand that Reddit is such a great source of
00:06:43
human-generated data for large language models to train
00:06:47
on. And so if you know you have
00:06:48
these licensing deals with OpenAI, that's a great business for
00:06:52
your social media company if you're kind of the remaining
00:06:54
trove of human generated data. Yeah.
00:06:57
And look, if we were talking and TikTok was a public company,
00:07:02
maybe it will be, maybe this Frankenstein TikTok US joint
00:07:05
venture will be soon. But if TikTok was public and the
00:07:09
Sora 2 launch happened like it did yesterday, the day before
00:07:12
we're recording now, I guarantee you TikTok stock would have gone
00:07:15
down based off of an invite only beta app that's super buggy,
00:07:19
that has no monetization, and who knows how much OpenAI is
00:07:23
committed to it. That's the... it feels like
00:07:26
everything is on a knife's edge around OpenAI. I think you
00:07:29
captured that in the piece, but I think it's actually just much
00:07:31
broader than we think. 100%, yeah, there's a much
00:07:35
larger version of it. Maybe someone else can do one that
00:07:38
talks about the second-order effects of it, and its
00:07:40
predictive effects, right. It's just the way people assume
00:07:43
things are going to play out because again, we're not dealing
00:07:45
with money that OpenAI really has.
00:07:48
I was also thinking, you know, about a
00:07:51
prospective TikTok stock going down after this Sora 2 launch,
00:07:54
you know, after the DeepSeek moment at the beginning of the
00:07:58
year, if Open AI were a public stock, it would have collapsed
00:08:03
at least briefly with people assuming that, you know, you,
00:08:06
they've been undercut on price and, you know,
00:08:09
comparable quality. So it's, it's such a crazy
00:08:12
volatile moment. And you know, the company that I
00:08:15
homed in on with this, I mean, I mentioned a bunch, but the one
00:08:18
that fascinated me the most is Oracle, of course, because
00:08:21
Oracle has gone through this pretty incredible transformation
00:08:26
over the last six months. Basically where this boring ass
00:08:30
legacy database company that had basically missed out on cloud
00:08:35
entirely. Like they were just getting
00:08:37
washed by all the hyperscalers, managed to snag this OpenAI
00:08:42
data center deal. And it suddenly became like a,
00:08:45
you know, a legitimate player as a, you know, cloud
00:08:49
computing company. They're an AI company all of a
00:08:51
sudden. Yeah, Larry Ellison's the
00:08:53
world's richest man now. Briefly, briefly.
00:08:56
Was oh, yeah, that's true. He slipped.
00:08:58
He slipped. But yeah, he's, I'm not going
00:09:01
to, you know, cry for him. His his losses here.
00:09:04
Yeah, but, but yeah, they've managed to kind of redefine
00:09:08
themselves all because of this one deal that came into
00:09:11
existence because Microsoft basically said they didn't want
00:09:14
to go in on Stargate, on this, you know, huge data center
00:09:18
vision that Sam had. And so I don't really know
00:09:22
where the Oracle story is, is going to go.
00:09:24
Like, their stock has been down, you know, 15% or so in the
00:09:28
last week 'cause I think people are sort of leveling off the
00:09:31
euphoria, but. Well, they have to go out.
00:09:35
They have to go out and raise the debt and they have to
00:09:37
actually build the data centers and finance it.
00:09:39
They don't have the cash flow to finance it themselves.
00:09:41
So basically what Sam Altman did, which is pretty genius, is
00:09:45
be like, OK, Larry, you figure it out and they have to figure
00:09:48
it out now. And they can probably use stock
00:09:51
inflation to help, you know, but they don't have the cash flow.
00:09:55
It's very telling to me, and I don't think anyone's really
00:09:57
gotten to the bottom of this, that a company with tremendous
00:10:01
cash flow, one of the leading hyperscalers, Microsoft, that
00:10:04
owns a huge chunk of OpenAI, has, you would think,
00:10:06
structurally every incentive to keep building out compute with
00:10:10
OpenAI, has stopped doing so. What does that
00:10:14
mean? I don't think anyone's really
00:10:15
answered that question, but you know, it would be very natural
00:10:19
for Microsoft to continue to partner with OpenAI and all
00:10:22
this stuff, and the fact that Sam is going to Oracle, which
00:10:25
doesn't have the cash flow to support this instead is pretty
00:10:28
telling to me. Yeah.
00:10:30
And this is one of those annoying moments where, like,
00:10:32
we're all reporters here, so rather than speculate, we should
00:10:35
probably figure it out. Yeah, we should just go and get
00:10:38
get to the core of it. But since we are here and we
00:10:41
haven't done the reporting yet, we might as well speculate on it.
00:10:46
You know, I came into this piece initially thinking Microsoft had
00:10:49
fucked up. You know, you looked at the
00:10:51
Oracle pop after the Stargate, you know, announcement
00:10:55
of their earnings. And you're like, man, that
00:10:56
should have been Microsoft's money, right?
00:10:58
That should have been them appreciating even more because
00:11:01
they were the ones that were going to commit however many
00:11:04
more 10s of billions of dollars to building, you know, Sam's
00:11:07
Tower of Babel. And then the more I thought
00:11:10
about it, I was like, yeah, but maybe, maybe not.
00:11:14
Maybe it's actually OK to not buy into every single thing
00:11:17
that, you know, your insane, megalomaniacal, you know,
00:11:21
partner wants to go into. And they run a real business
00:11:24
with Azure. You know, it does $75 billion a
00:11:27
year. Do you really want to hinge your
00:11:30
entire future on this company that's already been fairly
00:11:32
unstable? And they all freaked out after
00:11:35
the, you know, Sam Altman blip. And I don't, again, this is the
00:11:40
annoying speculation, but I don't know how much like all
00:11:42
that shit figured into the decision not to go in on
00:11:45
Stargate. But I could at least follow the
00:11:47
logic of like, there's a reason to maybe not do it.
00:11:50
And maybe they're not, you know,
00:11:53
maybe not the biggest loser in this moment that it might have
00:11:55
seemed after the Oracle bump happened.
00:11:58
Yeah, I mean, we actually do have a little reporting now.
00:12:01
I'm remembering I was at that dinner with Sam Altman where he
00:12:04
talked about the AI bubble a couple months ago.
00:12:07
And actually I'm remembering now...
00:12:09
Which moved stocks again. It did, which was wild.
00:12:11
I should have just, like, yeah, posted that during it. I thought
00:12:14
about like live tweeting the dinner and seeing what would
00:12:16
happen to, like, the Walter Bloomberg Twitter
00:12:18
account. But so,
00:12:22
I asked Sam and Brad Lightcap, who
00:12:25
kind of has been running a lot of this partnership
00:12:28
stuff about this at the dinner. I was like, why is Microsoft not
00:12:31
doing this? It seems strange to me.
00:12:33
And the version of their response was like, you should
00:12:36
ask them. But basically it's that no one
00:12:41
hyperscaler can support the ambition that OpenAI has and
00:12:45
that they're continuing to work with Azure.
00:12:46
And I know that's true. It is just strange.
00:12:50
Yeah. It's just strange when you see
00:12:51
like the market response to what Oracle's doing.
00:12:54
And Nadella, to me, comes across as a pretty even-keeled,
00:12:59
financial, tactical CEO. And the fact that he's sitting
00:13:05
out of this hype hysteria moment, to me is pretty, pretty
00:13:09
telling. Yeah, and he of course, had that
00:13:11
pretty, you know, vicious is the wrong word,
00:13:15
but he had like a nice kind of dig at Sam after the Stargate
00:13:20
announcement being like, well, we're good for our $80 billion.
00:13:23
Like, we actually have the money to fund our ambitions.
00:13:26
And, you know, people were like, oh, that guy's gangster.
00:13:28
He's badass at the time. And then when you saw the Oracle
00:13:31
pop, you're like, yeah, maybe he looks kind of like a wuss, you
00:13:34
know, that he was, you know, only willing to stick to the 80
00:13:36
billion that's actually just pocket change now for the
00:13:40
trillions that Sam is talking about.
00:13:43
But I, I wonder, you know, when it comes to the money, when it's
00:13:49
going to start looking a little shaky, you know, when we're
00:13:52
going to start seeing Sam or any of the companies that have to
00:13:56
raise money to be able to fund Sam's ambitions, see that
00:13:59
there's some pushback from the financial markets and you start
00:14:03
to see a sovereign saying like, well, we're not going to go in
00:14:05
for that level. We MGX, you know, we don't
00:14:08
believe in it to to that degree or show us more revenue or the
00:14:12
debt markets maybe saying like Oracle, the second tranche of
00:14:15
like the $20 billion that you want to raise, like we're going
00:14:19
to jack up the rates on it because we just don't know how
00:14:22
this is going to pay off for you in the near term.
00:14:24
And I, I don't follow this stuff too closely, but like the rating
00:14:29
agencies are probably going to start, you know, looking at some
00:14:32
of these bond offerings and saying like this is looking a
00:14:34
lot more precarious than it might have seemed a couple of
00:14:37
months ago. So.
00:14:39
I got into a little, yeah, I got into a little bit of an exchange
00:14:41
with Altman at that dinner about this because he was talking
00:14:43
about, he, like, looked at all of us at the table and, like,
00:14:46
stone-faced, said, you should expect us to raise trillions, right?
00:14:49
And everyone's like eating and we're just like, OK, and, and
00:14:53
I'm like, where are you going to get the money, Sam?
00:14:55
Where? You know, if Sam is good at anything, it's
00:14:59
raising money, right? He's a, he's a career VC.
00:15:02
He does a lot of like investing on the side.
00:15:04
He knows where all the pools of capital in the world are.
00:15:06
I said that to him like almost verbatim.
00:15:08
I was like, where are the trillions?
00:15:10
And his eyes kind of lit up and he started talking about
00:15:14
he's going to come up with this new financing instrument.
00:15:17
I think he called it. And he was like, if I knew, if I
00:15:20
knew what it was, I'd be so excited.
00:15:22
I'd just tell you right now we're figuring it out.
00:15:24
And then I saw that he mentioned that again.
00:15:26
He kind of alluded to that in the recent blog post about
00:15:29
Stargate. I think there's going to be an
00:15:32
announcement later this year is my bet from him on how they're
00:15:36
going to finance it. I can't imagine it's anything
00:15:38
other than like some mix of debt and tying, like, build-outs to
00:15:45
revenue and rev share or something.
00:15:47
Like there's only so many ways you can skin
00:15:50
a cat. Like I'm not really sure what
00:15:51
the yeah, the magic. There's debt.
00:15:53
There's vendor financing and then there's just cash flow,
00:15:57
I don't know what. And like another funny thing he
00:15:59
said at the dinner was like, we would be profitable if we
00:16:01
stopped doing training, which is like, sure, like if the
00:16:06
grocery store stopped paying for groceries, it would probably be
00:16:09
profitable. Way more profitable too.
00:16:12
But that was to suggest that there are levers they have in
00:16:15
the business, I think where if they need to start making a
00:16:18
profit, they could. Which if it's research and it's
00:16:21
training, I don't know if I buy that, but.
00:16:25
That's a ridiculous statement because the reason companies and
00:16:28
whomever are investing in them at that level is because they're
00:16:31
promising AGI. So if they're to say we're going
00:16:33
to stop trying to get after AGI and just like try to coast off
00:16:36
of our consumer-grade product, that's not going to give the
00:16:39
kind of returns that SoftBank or any of the number of other huge
00:16:42
companies that have put money into OpenAI were expecting when
00:16:45
they when they, you know, took on that equity.
00:16:47
Maybe you can get to AGI through AI Vine with ads.
00:16:51
Yeah. We'll get that.
00:16:52
That's actually, yeah. We're going to jump into that a
00:16:55
bit. What do you, I want to bring up,
00:16:57
Alex, your story about ads because we're talking about
00:17:00
like, I guess you can also just make money on your own as a
00:17:03
company to try to fund these things.
00:17:05
What do you, what do you know about open AI's ads?
00:17:08
How they're going to start shoving ads into our into our
00:17:11
chat? Yeah, this was, I feel like, I
00:17:14
feel like I have to plug things now.
00:17:15
It's really strange. This was the Sources.news
00:17:17
scoop last week. Fiji Simo, the new CEO of Applications at
00:17:23
OpenAI, which basically the way it's split up now, and I don't think
00:17:26
this is fully understood externally.
00:17:27
It's like Sam is running research, safety, infra with Greg
00:17:33
Brockman and then Fiji has everything else and that
00:17:37
includes ChatGPT, the consumer business, marketing, basically all
00:17:40
that stuff that is a normal company.
00:17:43
And she is now looking for someone to run all of
00:17:47
monetization reporting to her. And part of this mandate, and
00:17:51
it's also very clear from the people I know she's talking to,
00:17:55
which includes some of her former Facebook colleagues, is
00:17:58
that this is to do ads, just to put ads in ChatGPT.
00:18:02
I mean, her title is applications.
00:18:03
And that didn't make sense until this week when they released the
00:18:06
Sora app. I think that there will be a web
00:18:08
browser app. I'm sure they'll do others and
00:18:12
the idea is to monetize all that.
00:18:14
And, you know, it's also including the subscriptions, but
00:18:17
it will be ads and they're already thinking through that.
00:18:20
Which is like, that's her bread and butter, right?
00:18:22
She ran Facebook video. She understands that stuff.
00:18:26
She was a pretty critical part of Facebook figuring out
00:18:30
monetization in the very early days, even before she was just
00:18:33
running video. That's the kind of work that goes
00:18:38
behind the scenes that no one really thinks about unless
00:18:40
you're in it. And it's kind of thankless work,
00:18:42
but it's what makes the billions of dollars.
00:18:44
It's like ranking optimization, creating like structured data,
00:18:49
connecting the dots. It's very telling that you know,
00:18:52
they just acquired Statsig, which was basically
00:18:57
another colleague of hers at Facebook went off and made a
00:18:59
startup to sell a version of the A/B testing software that
00:19:04
Facebook pioneered. That really is what helped it
00:19:08
scale to billions of users and billions of dollars.
00:19:10
That tooling is now internal at OpenAI and that's going to be
00:19:13
very critical, I think, for them to monetize ChatGPT. And he's
00:19:18
running engineering now, Vijaye, the CEO of Statsig.
00:19:21
So yeah, she's getting the band back together
00:19:24
and it's like there's there's a small group of people that know
00:19:27
how to do this on a scaled platform and that have done it
00:19:30
before. And they either worked at
00:19:32
Facebook or Google. I have to say with the launch of
00:19:35
Vibes and you know, Sora 2's video feed coming back and forth
00:19:40
right there, it does seem like, given, you know, Fiji's
00:19:44
hiring spree, that OpenAI is envisioning revenue
00:19:48
and profit generation right now coming from competing with
00:19:53
kind of app-based products that sort of were Meta's bread and
00:19:56
butter for a while. Like, the Meta money-printing
00:19:58
machine, it comes from, you know, these
00:20:01
really addictive consumer applications.
00:20:03
And so it makes sense if you are looking for money right now to
00:20:07
poach the geniuses behind that and build something very
00:20:10
similar. But I think in the race between
00:20:13
which of the foundation model makers are really pushing against
00:20:17
each other right now as we get into the product side, it seems
00:20:19
like OpenAI is competing more with Meta directly as a rival
00:20:23
than, you know, maybe Google, which it was at one point.
00:20:25
I think in the model race back in the day with Gemini, it seems
00:20:28
like they were much more neck and neck.
00:20:30
I mean, I think they're competing with both from like a
00:20:33
use case perspective. The ambition's actually pretty
00:20:35
crazy, I think. I think they're going for the
00:20:37
core Google use cases, rebuilding G Suite, an AI
00:20:41
version of G Suite from the ground up.
00:20:44
They're, you know, trying to replicate Google Cloud
00:20:47
with Stargate. And ChatGPT does a lot
00:20:51
of the stuff that you would use search for, which, you know,
00:20:54
everyone knows. But yeah, I think the meta side
00:20:56
is the newer thing. And what explains that rivalry
00:21:00
that we've seen playing out over the last year or so?
00:21:03
They've hired a ton of ex-Facebook people at OpenAI.
00:21:05
And then as we all know, Zuck has gone and spent a lot to hire
00:21:08
a bunch of ex-OpenAI people. So yeah, it's a small
00:21:13
world, but I see them actually competing with both Google and
00:21:15
Facebook. The ad stuff is so interesting
00:21:19
to me and Alex, you and I've been like social media company
00:21:22
reporters for a long time. And you always see this
00:21:25
trajectory that a company goes on where they think they can
00:21:28
like re-skin the cat on how to make money, and no one ever
00:21:33
really goes into it wanting to do ads.
00:21:34
And like, famously, you and I both
00:21:38
covered it. But just like the whole
00:21:39
trajectory that Evan Spiegel went on at Snapchat, where he
00:21:42
basically tried to go through every other option of making
00:21:46
money off of the app before realizing like, fuck, we got
00:21:48
eyeballs. This is just what you're
00:21:50
supposed to do. Let's just shove ads into it.
00:21:53
They never really made a great business out of ads, but that's
00:21:55
a separate point. But like, you know, it's, it's
00:21:58
just it's the most derivative and uninspiring way to make
00:22:01
money from an Internet product that has a lot of users.
00:22:04
But you just, the money is there on the table and, and I guess at
00:22:09
some point you realize, like, why are we, why are we beating
00:22:12
ourselves up trying to find so many other options when like
00:22:16
it's different with chat because obviously they do have real
00:22:19
revenue from, from subscriptions and stuff.
00:22:21
But this is where there's a... ceiling to that. There's only so
00:22:25
many millions of people in the world that will pay a
00:22:27
subscription. Yeah, it's a good business, it's billions a year,
00:22:30
and they are. But look at what they're doing in India, which I
00:22:33
think will soon be their largest market if it isn't already.
00:22:36
They're doing a lower priced subscription because you can't
00:22:41
do a $20-a-month subscription in India and expect hundreds of
00:22:45
millions of people to buy it. So their, their ability to like
00:22:48
keep eking out incremental gains through subscriptions is already
00:22:51
hitting a wall. So they have to do ads.
00:22:53
I mean, it's the only way you're going to fund this stuff.
00:22:55
It's the only way that. Netflix too, another example.
00:22:57
Yeah, it's the only way that, by the way, Meta and Google
00:23:00
fund whatever they're doing: the metaverse, the glasses,
00:23:04
Waymo, Alphabet, all this stuff has been built on the back of
00:23:08
that. It's like literally all the
00:23:09
cool, weird, sometimes-failed, sometimes-not crazy
00:23:14
moonshot technology of the last 15 years has been funded by
00:23:18
advertising. Yeah, but they're also good ad
00:23:21
platforms in their own way. Google and Meta slash
00:23:25
Facebook are, because they have really good data and
00:23:28
they're monopolies for the categories that they exist in.
00:23:31
And so like when it comes to search ads, you know, Google is
00:23:34
the one-stop shop, and they're also very effective and have done it for a
00:23:37
while. And you know, same with Meta and
00:23:39
Instagram ads. I don't know what the niche will
00:23:42
be for chat. Like they are a monopoly within
00:23:45
consumer, you know, AI chatbots.
00:23:48
So, you know, the money that will exist in that world, they
00:23:52
will probably consume almost all of it.
00:23:55
And then obviously, you know, open AI is going directly like
00:23:58
you mentioned at Google at search.
00:24:00
They are kind of cutting down on search traffic and they're,
00:24:05
they're eating into, you know, the amount of views that
00:24:08
search gets. And so I guess their play
00:24:11
will eventually be like, well, if you know, search is getting
00:24:14
less attention, you need to bring, you know, this attention
00:24:16
over to chat. But what I don't know is like,
00:24:20
are they going to leverage the data that they need to, in order
00:24:22
to have targeted ads? Is that going to be something
00:24:25
people are comfortable with? Is it like as a user experience,
00:24:28
something that people will, you know, just sort of accept in the
00:24:31
way that they did with, you know, news feed ads or, or
00:24:35
search text ads? I mean, it is going to be a
00:24:37
relatively new experience. I mean, have you guys asked
00:24:41
ChatGPT what it knows about you? No.
00:24:43
No, let's do that. No.
00:24:45
I mean, this also came up at the dinner.
00:24:47
I was asking Sam about like these profiles that you have of
00:24:50
people you have like probably the best advertising profile of
00:24:55
an Internet user of any company on earth because of the kinds of
00:24:58
things that you tell ChatGPT. And that data is probably not
00:25:02
structured in the way it needs to be. That's the work that Fiji and
00:25:05
this team she's hiring for are going to be doing.
00:25:07
And I expect that to take months.
00:25:09
It's very hard to do at a platform at their scale because
00:25:12
I don't think they had the foresight when ChatGPT was
00:25:14
scaling that they were going to do this.
00:25:16
So they have to kind of go back and rebuild the plane as it's
00:25:18
flying. But there's no question this is
00:25:21
going to be a powerful ad platform.
00:25:23
I mean, look at what they just did with pulse.
00:25:26
The thing is, it's gated right now to the $200-a-month Pro tier.
00:25:31
But it's basically an AI-generated digest every morning that uses
00:25:36
your history and your connectors to Gmail and whatever
00:25:39
and say, you should follow up on this email.
00:25:41
This is the plane ticket, blah, blah, blah.
00:25:44
Here's like a bedtime story for your kid tomorrow night.
00:25:47
Like Oh my God, that is an incredible ad surface.
00:25:50
Wait, Tom, do you have your profile?
00:25:52
Do you know what? Yeah.
00:25:52
Yeah. Do you want me to read it so I
00:25:54
can let people know how to, how to advertise
00:25:56
to me? It says you're really into exercise, nutrition, just
00:26:00
letting people know these things.
00:26:01
Yeah, these are sports, writing and work, because I have maybe
00:26:06
used it to polish a couple of ledes.
00:26:10
Personal life. You're from Russia.
00:26:12
Interesting. It says personal life.
00:26:16
You're from Russia, you had a stomach flu recently.
00:26:19
You sometimes like a Jamaican patois accent when we talk about
00:26:22
medication. What?
00:26:26
Wait, is the last one true?
00:26:28
I don't believe all the other stuff, but please tell me the
00:26:30
last one's true. So this is
00:26:33
partially true. When I had COVID a couple of
00:26:35
weeks ago, I wanted advice from it on how
00:26:38
to like monitor or to measure out which like fucking sleeping
00:26:42
medications I was taking. I don't know, I was bored, I had
00:26:44
COVID, it was my birthday, I was depressed.
00:26:47
And so I was asking it to tell me like, oh, when should I take
00:26:49
Nyquil? When should I take Advil?
00:26:51
And then in that process I was like, can you also explain this
00:26:53
stuff to me in Jamaican patois? Did it do it?
00:26:59
Yeah, it did, but I don't think I can read it on air.
00:27:04
No way. It actually did it?
00:27:04
Oh yeah, yeah. I'll send it to you later, Alex.
00:27:06
Again, I don't think I could read it out loud, but it was
00:27:08
very funny. I got.
00:27:09
I got good mileage out of that. OK, so at least some of this is
00:27:12
true. I don't know why it decided I'm
00:27:14
from Russia. Are you sure it was wrong?
00:27:17
Because I was very, very specific about COVID, which was
00:27:19
not stomach flu. Have you, are you like a
00:27:21
fan of Putin? Have you been like asking a lot
00:27:23
for like pro Putin propaganda or what?
00:27:26
I do love my strongmen, yeah. So it interpreted from that and
00:27:30
then movies and culture and then other bits.
00:27:34
You've asked about products, such as company accuracy checks.
00:27:36
OK, so I do do some fact-checking through ChatGPT, asking
00:27:40
about Broadcom, Oracle, NVIDIA, Confluent.
00:27:44
That's the high level sketch. Do you want me to keep going?
00:27:46
No, I don't chat. I don't.
00:27:48
What's yours, Alex? Let's
00:27:50
go. I'm scared to do mine. Madeline, why don't you read
00:27:53
yours? Well, I searched mine and it
00:27:54
didn't work very well because I have a separate ChatGPT account
00:27:58
for my personal use and right now I'm only logged in for the
00:28:02
newcomer business account so it only knows things related to me
00:28:05
about business. But it does say, here's what
00:28:09
I've learned about Madeline Renbarger.
00:28:11
You are an employee at Newcomer. And here's your e-mail and your
00:28:16
Twitter handle. And it says:
00:28:18
I joined Newcomer as the publication's first reporter to
00:28:22
cover startups and venture capital and previously was at
00:28:26
Business Insider. So it knows a lot about my work
00:28:28
history, but it knows very little about what I like to
00:28:30
search. So if I go to my personal
00:28:32
account, that's where the juicy details will really lie.
00:28:34
But. Yeah.
00:28:35
So here's mine. You're a tech journalist
00:28:38
currently launching your independent media venture.
00:28:40
You're leaving The Verge, where you were a deputy editor, to run
00:28:42
your publication Sources and a new companion podcast, Access, with Ellis
00:28:45
Hamburger. You cover the inside
00:28:47
conversation in the tech industry: AI, Big Tech, executive-level
00:28:50
developments. You regularly interview top
00:28:52
figures like Mark Zuckerberg, Sam Altman, Dylan Field, Bret
00:28:54
Taylor, Andrew Bosworth. You're setting up a California
00:28:57
LLC. You're evaluating credit cards
00:28:59
and banking. You've been negotiating
00:29:01
distribution and sponsorship deals.
00:29:03
Reach out. By the way, if you're listening
00:29:04
to this, you carefully think about revenue splits,
00:29:07
monetization models, long term growth strategies.
00:29:09
You enjoy cooking, You're into travel.
00:29:11
You like film photography, You play poker.
00:29:13
You enjoy gardening and plants. Kind of.
00:29:15
I mostly ask about that for my wife.
00:29:17
You like efficient direct answers and then it keeps going.
00:29:20
So, like, this is stuff that would help.
00:29:22
So you're just getting pissed off at it, like, you're taking
00:29:24
too long, I don't like this, keep it
00:29:26
quick. Like, when the
00:29:28
blowback to 4o was happening, I was like, what are people
00:29:30
talking about? Like I don't want this thing
00:29:32
like pretending like it's my friend.
00:29:35
Yeah. But.
00:29:36
So you were, Yeah. No, this is like, this is like
00:29:38
an advertiser's wet dream. Like.
00:29:39
Yeah, it's honestly a lot more detailed than I feel like even
00:29:42
on Meta. I know you can go in your
00:29:44
settings and search how much from your Facebook profile it
00:29:46
can glean about you. And this is, I'd say on par with
00:29:50
that. So in terms of data, they're
00:29:52
kind of neck and neck with Facebook there about what they
00:29:55
can glean. I would say it's more, I would
00:29:57
say it's more because it's like, it's like search, it's an intent
00:30:00
based environment. Whereas meta is having to infer
00:30:04
based on more passive interactions, scrolling, liking,
00:30:08
sending something. Those are like, they're not.
00:30:11
They're implicit, they're not explicit.
00:30:13
Whereas you're literally telling ChatGPT what you want, what
00:30:16
you're thinking about, your deepest, darkest secrets.
00:30:19
So yeah, the, the opportunity financially is tremendous.
00:30:21
The responsibility is even greater.
00:30:24
And, well, have you gotten a sense, you know, very
00:30:28
early days, about, like, where they're trying to put guardrails
00:30:30
in on how data is used? I mean, I asked, I asked Sam
00:30:33
about that at the dinner, and he was like, yeah, it's a huge
00:30:35
responsibility. We think about it, but like,
00:30:37
we'll figure it out. Yeah, by the way, you guys can
00:30:40
read it later, but I threw in my Jamaican Patois answer into the
00:30:43
chat again. I do not feel comfortable
00:30:46
reading it out loud. It should be the cold open.
00:30:50
Yeah, yeah. Well, I tried to get the voice
00:30:52
to do it but. Good God.
00:30:55
Yeah, no. Good thing we didn't read this.
00:30:57
Yeah, I can't. We've all seen the Adrien Brody
00:31:00
SNL clip, and as much as he does resemble me in some ways, I
00:31:05
don't want to. I don't want to experience that
00:31:07
level of cancellation. You don't want to do a Ross
00:31:09
Trent bit here so. Let's talk about, let's talk
00:31:13
about what's going on in our socials these days with
00:31:16
Sora 2 and Meta Vibes. Alex, I've seen you post some of
00:31:22
your slop on the feeds.
00:31:27
The slop factory mining away. Madeline, you got an invite
00:31:32
code. Have you, have you played around
00:31:33
with it yet? I did, yes.
00:31:35
I just started playing around with it.
00:31:37
So I was, I was generating myself kind of gardening some
00:31:42
strange plants. I think I was on a little
00:31:44
kick there for a bit, but I have not.
00:31:48
I have noticed that the guardrails are pretty intense
00:31:51
about what it can or cannot generate.
00:31:52
Like I tried to get it to generate me jumping off of the
00:31:57
Empire State Building onto a Phoenix and then flying away and
00:32:01
it will not let you use the prompt jump off a building.
00:32:04
It will not let you generate that.
00:32:06
So actually the content moderation is pretty limiting,
00:32:09
honestly, from what I found, a lot of it to do with avoiding,
00:32:12
you know, they're really sensitive about, like, the self-harm kind of
00:32:14
dangerous imagery stuff. But I found that interesting
00:32:17
like. You can get around it, though.
00:32:19
Like I, yeah, I asked it to make me look like Superman.
00:32:22
It wouldn't do it. And then I was like, you know,
00:32:23
red superhero with a cape flying.
00:32:26
And it was like a mix of Supes and the Avengers.
00:32:28
I mean, the the IP theft is, is incredible.
00:32:31
There was a Wall Street Journal story that they're basically
00:32:34
telling all the studios you have to opt in to not have your IP
00:32:38
featured in this. So I've seen like straight up
00:32:41
Pokémon. Pokémon and Mario and
00:32:45
Luigi. Rick and Morty.
00:32:46
It's struck a deal with Nintendo?
00:32:48
I've seen a lot of SpongeBob too.
00:32:50
Yeah, I mean, Rick and Morty, like there's no way they got all
00:32:53
these deals. There's no way.
00:32:56
So we're just betting that this will play out how it has with
00:32:59
all the text based copyright AI stuff, which is like years of
00:33:03
litigation, some settlements, and by the time it really gets to
00:33:07
like a ruling, we're on, like, AGI. I mean, I think that's the play.
00:33:11
Classic tech industry play, you know, just, like, basically
00:33:14
break the law or, you know, get to a point where you get sued
00:33:16
for allegedly breaking the law and then pay a speeding ticket.
00:33:19
Yeah. And then everyone kind of moves
00:33:20
on. It's like YouTube, Viacom all
00:33:22
over again. I mean, you remember it
00:33:23
all. Yeah, yeah, right. And we saw what happened to the
00:33:26
industry after that. I mean, it does get fairly
00:33:27
depressing if you're someone who cares about that stuff.
00:33:30
But the thing that I guess everyone is trying to figure out
00:33:35
right now and I'm on the fence about is like, is this really
00:33:40
that big of a moment, this Sora 2 thing that we're seeing?
00:33:43
Or is it just, you know, a fad that dominates our algorithms
00:33:49
for a couple of days, a week or two, and then it just kind of
00:33:51
fades into the background as another kind of cool feature.
00:33:55
And it's hard to have this conversation, it feels like,
00:33:57
because we're in the midst of this deluge of all of this, you
00:34:02
know, just being overwhelming. That's basically all I see on my
00:34:05
fucking feeds right now. I mean, where do you stand on
00:34:08
this? Because I think you and I are
00:34:09
going to come out on different places.
00:34:10
Alex, the...
00:34:11
maximalist argument. The top three free apps in the
00:34:13
App Store right now, in order, are Gemini, ChatGPT, and Sora.
00:34:17
So that's within like 24, not even 24 hours. And
00:34:22
It's not even public access yet. It's not public access.
00:34:25
The reason it's going to work and the reason it is a success
00:34:29
and Meta Vibes was a slop failure, is the cameo thing:
00:34:34
being able to feature your friends, who opt into sharing their
00:34:38
likeness with other people. Sam Altman opted into sharing
00:34:41
his likeness with everyone on the app so he's like the main
00:34:43
character. Like you cannot.
00:34:45
If you scroll through the feed it's just non-stop like Sam
00:34:48
Altman GPU jokes.
00:34:49
Oh yeah, Alex, I was joking with Tom earlier
00:34:51
that half of my Sora feed is Sam Altman and the other half is
00:34:55
Jesus. So.
00:34:56
Yeah, and like he wanted that, clearly.
00:34:58
And like, I think, you know, all these tech CEOs love attention and
00:35:01
I'm sure Zuck is fuming that he's not like the main character
00:35:04
of Vibes. Could you have imagined if
00:35:05
the shoe was on the other foot and Vibes had a cameo
00:35:08
version and, you know, Zuck had served himself up to the
00:35:11
front? People would, like, it would...
00:35:12
Would people like that, though? People have been meaner about
00:35:15
it, you know, because people just have a natural revulsion to
00:35:17
Zuck, either his personality or the.
00:35:19
Technorati folks. The technorati folks, but normal people would
00:35:23
like it. Again, who's to say, because it
00:35:26
didn't happen, But like, Sam kind of has this, you know,
00:35:29
dorky philosopher image that people, you know, lightly make
00:35:32
fun of, but you don't see this level of pure hatred.
00:35:34
Whereas Zuck, you know, really does bring a darkness out of
00:35:37
people because of the way he is. I know
00:35:39
you're closer to him than I am as a, you know,
00:35:42
interview subject, but do you think this could have happened,
00:35:46
the same exact situation, with Zuck in the Sam position
00:35:49
and Vibes just had this cameo feature?
00:35:51
I don't know about the exact thing. I mean, like, ChatGPT has such a
00:35:55
halo. Like the vibes are good.
00:35:57
Everyone wants to know what OpenAI is doing.
00:35:59
Like, they're the belle of the ball.
00:36:02
Like anything they do is going to get the most attention.
00:36:04
So sure, it's not the same. At the same time, I think, and I
00:36:08
wrote about this in Sources last night, is like, I think Meta
00:36:11
fundamentally misunderstood that people.
00:36:15
It's not that people don't want AI slop, it's just they want AI
00:36:18
slop with their friends in it. And I think the reason people
00:36:21
were... Ironic for the social media
00:36:23
company, you know, the media company, the
00:36:25
friend company? Yeah, for the social media
00:36:27
company to not understand this, right.
00:36:29
And they clearly rushed it out. I actually know this.
00:36:31
They rushed it out because they knew that Sora was coming, the
00:36:35
Sora app was coming and they wanted to be first.
00:36:37
And they're not even using, like, custom models; they're using
00:36:39
Midjourney and Black Forest. And it was very hacked together.
00:36:43
Like, they tried to get me to write about it and gave me
00:36:46
access. I think I was the first person
00:36:48
to get access, like before it launched, and like the metadata
00:36:51
was broken. Like, there were text strings on
00:36:54
the captions that weren't English,
00:36:55
like, didn't make sense. Like, it was so rushed together
00:36:59
and you saw the dunking on Twitter about it because
00:37:02
it was like there was no humanity to it.
00:37:06
It was just like all these crazy fantastical.
00:37:09
It was like DeviantArt on steroids.
00:37:11
And like there's a market for that.
00:37:13
That's fine. Like that is an interesting
00:37:15
idea, but. It is the news feed to a
00:37:18
degree at this point. Kind of.
00:37:19
I mean, the news feed at least still has people in it.
00:37:21
It may not be people you want to see or it may increasingly be AI
00:37:24
characters, but like the meta vibes feed is like a dog that
00:37:27
looks like a unicorn, you know, flying like into hell.
00:37:31
Like it's all this crazy stuff. And like Sora is, this is
00:37:35
what I call it: it's AI Vine.
00:37:37
That's completely what it is. It's like Vine but AI-generated.
00:37:40
It's funny, it's light-hearted, it features your friends' faces,
00:37:44
you know, like that's the better approach.
00:37:47
Right, well, so, but then this brings us to the question,
00:37:50
'cause you make the Vine comparison.
00:37:51
I mean, that obviously was a phenomenon in its day.
00:37:55
Gets bought by Twitter, they kind of kill the whole thing, and,
00:37:57
you know, TikTok comes along to basically take short-form
00:38:00
video and make it into a massive industry.
00:38:03
But how do you know that what's happening right now as we're in
00:38:06
the heat of this moment isn't just going to be another Studio
00:38:10
Ghibli moment or, you know, Facebook filters moment or MSQRD
00:38:14
or any sort of thing that people kind of enjoy in that particular
00:38:18
moment. It's fun.
00:38:19
You can put it out there, you get some attention for it.
00:38:21
And then it just was like a toy that you put back onto the shelf
00:38:24
because it didn't fit intrinsically into the way you
00:38:26
really want to express yourself. I mean...
00:38:28
Studio Ghibli, I mean, Studio Ghibli added like
00:38:31
100 million users to ChatGPT, so I don't really buy the
00:38:34
argument that that was like a fad that like didn't matter.
00:38:36
Well, but are these people using it? I'm not saying it won't
00:38:39
have benefits to OpenAI or to ChatGPT, but I'm saying as a
00:38:42
medium as a, as a mode of expression.
00:38:45
Again, it's only based on what I consume through my socials.
00:38:48
But like I don't see the Studio Ghibli thing happening anymore.
00:38:50
It was a couple of weeks, which is more than I thought it was
00:38:53
going to be. But like, you know, does this,
00:38:56
you know, is this just a, a feeding mechanism for more
00:38:59
people to sign up for ChatGPT? Yeah.
00:39:01
Or, or I guess Sora in that case. Or do you think like this is
00:39:04
going to be sustained as a, you know, mode of expression as
00:39:07
another way to people to create shit?
00:39:09
Who knows if it will be sustained, but I think it
00:39:11
clearly has, as they say, product market fit within the
00:39:15
first 24 hours, which we can't say that Meta Vibes did. So.
00:39:19
Oh yeah. You know, it's #3 on the App
00:39:22
Store, it'll probably be #1 at some point this week.
00:39:25
It's not even open to the public.
00:39:28
It's just funny. Like I've laughed when I'm on
00:39:31
it, which is like something you will never say using that AI.
00:39:35
And I think that says a lot. Like there's something about
00:39:38
AI content that is silly and jokey and featuring people,
00:39:43
you know, and their faces that is just way more approachable
00:39:46
and feels like oddly human in a really strange Black Mirror way.
00:39:50
But it's something you can't, it's like you can't look away
00:39:52
from. At least that's my experience.
00:39:53
So it's like, I mean, it's slop. Like it's not.
00:39:56
I'm not pretending that this is like, you know, the same thing
00:39:59
as like solving, you know, cancer with AI, but.
00:40:03
I mean. Or making a real, you know,
00:40:05
movie or or TV show or something.
00:40:07
I mean, that's, yeah, this branches off in a whole
00:40:10
different discussion. But you know, anytime a tech
00:40:13
video creation tool goes off in the way this one has, you do
00:40:16
have people out there being like, oh, Hollywood's fucked.
00:40:18
Oh, this is the only way that we're going to be making movies
00:40:20
in the. Oh, it was always fucked.
00:40:22
Yeah, I also just want to disregard that point.
00:40:24
It's just a whole different conversation, but.
00:40:26
Yeah, short-form video isn't comparable to, like, cinema. Yeah.
00:40:30
Yeah, it is its own thing. And like film festivals have a
00:40:32
separate track for AI video, like they do consider it its own
00:40:35
genre. For now, right now, for now, for
00:40:38
now. But yeah, as a creator, you
00:40:43
know, sorry, not myself as a creator, but as someone who's
00:40:45
like, played around with this stuff.
00:40:47
You're a creator, Tom. Don't sell yourself.
00:40:49
Am I? Yeah.
00:40:50
Am I? I don't know.
00:40:51
I like to consider myself a philosopher influencer.
00:40:57
I didn't have as much fun as I thought I was going to have on
00:40:59
it. I don't know.
00:41:00
I made a couple of videos. Most of them were me and Sam
00:41:02
just hanging out, just sort of fulfilling my fantasies.
00:41:05
But. Please do tell.
00:41:08
It just, we were, you know, talking, we were podcasting.
00:41:11
That's all I really think about. And you know, I, I did one video
00:41:15
where it was me and Sam and we were, I turned into a mech and I
00:41:19
wanted to stomp him actually, like I was going to be a giant
00:41:22
mech and stomp him. But I typoed and I said stop
00:41:24
him, and it made it sound like I was trying to stop him from
00:41:27
achieving AGI. I did a video of me like as a, as a man.
00:41:32
This was not creative stuff. This is just my fault I guess.
00:41:36
I was like a, a... You got a drink.
00:41:37
I was, I was in a western. I was coming into town and I was
00:41:40
like, kind of like a baddie from The Good, the Bad
00:41:43
and the Ugly. And I was like, you know, where
00:41:46
are all the tech reporters? I want them out here and I want
00:41:48
them gone. That's a good.
00:41:51
I don't know. Are you happy with your,
00:41:52
are you happy with your, your sort of
00:41:53
video? I need to think about it more.
00:41:55
I mean, like you, I struggle with creativity, but you know,
00:41:58
it's the thing that the thing that it works about it is the
00:42:01
remixing. Like, you know, Katie
00:42:03
Notopoulos, who I would say is much better at this, will just like
00:42:06
make something because I've allowed anyone to use my
00:42:08
likeness and then I can remix it.
00:42:11
I can be like, oh, that's an interesting idea.
00:42:12
I want to like put myself in a GPU jail as I'm reading the
00:42:15
newspaper about Sora hitting number one in the App Store,
00:42:17
which is something I did this morning.
00:42:19
And like, that's fun. That's like no other platform
00:42:22
you can do this on. Like at least it's new.
00:42:24
It's probably going to destroy society in some way.
00:42:28
Yes. It's like, there's a really
00:42:30
good argument that like, are we lighting up data centers and
00:42:33
poisoning the climate to create the slop?
00:42:36
Like, there's a lot of good arguments about why you
00:42:37
shouldn't do this. At the same time, like, I
00:42:40
laughed last night. And you know what?
00:42:41
Like in launching a company, I haven't had a lot of time to
00:42:44
laugh. So that was nice.
00:42:46
As we saw from your ChatGPT records, none of them had
00:42:48
anything to do with mirth. You're not, Alex,
00:42:52
a mirthful person either. I guess maybe, like, to close
00:42:56
off this, this section of it: you've basically already
00:43:00
described it as a success. Like, it's high enough on
00:43:03
the App Store. It's gotten enough attention on,
00:43:06
you know, our socials that this did everything that Sam wanted
00:43:09
it to do. I mean, resounding, like, day
00:43:12
one success. Yeah.
00:43:13
I mean, I think like Meta, the largest social media company in
00:43:17
the world that can like promote things on Instagram, Meta Vibes
00:43:21
was not in the top 20 of anything.
00:43:23
Like, no. So the vibes are, the vibes are
00:43:27
changing, as they say. Yeah, the vibes are off.
00:43:30
What does Google need to do to respond?
00:43:32
I've been having a lot of fun with Nano Banana.
00:43:34
Yeah, I mean, they're the #1 tool. And I think all this is showing,
00:43:37
by the way, that like, people just want this AI stuff to help
00:43:39
them make things and like create and be expressive.
00:43:42
And honestly, it reminds me of when we covered Snap, Tom, and they had
00:43:45
all those viral moments with the face filters, which really
00:43:48
propelled Snapchat through a lot of key growth moments.
00:43:51
It was like face swap, the crying thing, like all of that
00:43:57
was like, it was new at the time and people ate it up and it was
00:44:00
viral. And it's great for a social
00:44:02
product. And like, Gemini's going through
00:44:04
that right now: they're #1, Nano Banana's crushing, and then
00:44:08
ChatGPT is number 2, and Sora's #3, and it's like those
00:44:11
are the three tools right now that are letting people do weird
00:44:14
new things with AI that are expressive and weird and fun.
00:44:18
Yeah, the fun thing with Snapchat was always so
00:44:22
interesting with Evan Spiegel because he had this
00:44:25
commitment to saying Snapchat is a toy, we are not trying to, you
00:44:30
know, have effects on society. He would get a little off, you
00:44:33
know, off that mission at times because he would say this is
00:44:36
like essential to how young people communicate and we are
00:44:39
giving them the platform to express themselves in ways they
00:44:42
wouldn't be able to otherwise. And literally in the S-1, he
00:44:45
was saying like, this is a toy. You know, Sam has to sort of
00:44:48
balance the like, we have made a fun toy with the like, yeah, we
00:44:51
are curing cancer and this is going to have huge job
00:44:54
displacement. And, and I don't think he's
00:44:57
figured out the message yet on that.
00:44:59
Exactly. Like I saw this morning, he was
00:45:02
trying to like, you know, split the baby on like, well, it is
00:45:05
important that we do these things because we need to show
00:45:08
off what our technology is capable of doing along the way.
00:45:11
And then also we'll do the AGI thing.
00:45:13
It's all a step towards that. Yeah.
00:45:15
I mean, there's, that's a debate inside OpenAI.
00:45:18
I mean, you've got a lot of people who join these labs to
00:45:20
invent AGI and to, you know, increase GDP and create like
00:45:26
amazing productivity software. And now they're being told to
00:45:28
create like AI TikTok. So you're going to have a
00:45:31
culture clash for sure. Yeah, yeah.
00:45:34
And and it, it makes me wonder, you know, because there are
00:45:36
companies out there that are still dedicated.
00:45:38
I mean, basically Ilya Sutskever's company is out there saying like
00:45:42
it's AGI or nothing. And I mean, you, that's
00:45:46
interesting. But about this internal debate
00:45:48
there, like I've got to think that's a good recruitment tool
00:45:50
for SSI, right? For sure, basically.
00:45:53
Say, like... I was going to say there's
00:45:55
even a bunch of other startups jumping into this space too.
00:45:58
Periodic Labs came out of stealth on Tuesday and they're
00:46:02
coming around. You know, they've, it was
00:46:04
basically a lot of the scientists who defected from
00:46:06
OpenAI and DeepMind who wanted to be using this tech to, you
00:46:10
know, look at the genome and discover new chemicals and do
00:46:13
drug discovery and scientific discovery.
00:46:15
And they basically, in my opinion, that's like, oh, there
00:46:18
was this mass defection of some of these researchers who are
00:46:22
motivated by meaningful work, quote unquote, you know, starting
00:46:25
their own company. So it does seem like people are
00:46:27
just doing this in a way where they maybe will go out and start
00:46:30
their own business. Yeah, No, I think this will be I
00:46:33
mean, this is a good story, wink, wink.
00:46:35
But, like, this culture clash of AI people building the stuff
00:46:39
who want to build AGI, science breakthroughs, curing cancer.
00:46:45
And I would put, like, Demis in this camp, I would put Ilya.
00:46:50
Mustafa certainly thinks he is, over at Microsoft. I don't actually know
00:46:54
where he is on the slop. You should have him on the pod
00:46:56
and then, and then, I don't know if
00:46:57
he's dying to talk to me, but.
00:46:59
Yeah. And then Zach and then I guess
00:47:01
Sam now to a degree who are like, oh, we can also do this
00:47:05
slop and we can do like social and we can make this very
00:47:09
consumery because we have to make a business in the near
00:47:11
term. That's a culture clash for sure.
00:47:14
And that's fun to cover. Yeah, and it also
00:47:17
speaks to the moment that we're in, where you have this whole
00:47:22
group of young people, younger even than you,
00:47:24
Alex, who are part of this AI, you know, San Francisco AI
00:47:28
revolution, who are, you know, these extremely grind-set,
00:47:34
don't-even-date types. All they believe is that
00:47:37
they're being part of some sort of socially transformative
00:47:40
technology. And that's the only thing that
00:47:43
they talk about. Like, I don't know if that's the
00:47:45
entirety of the movement, but it's a core part
00:47:47
of it. And you know, that's what's
00:47:50
gotten them excited. I don't know how skillful they
00:47:52
are. I don't know how important
00:47:53
they're going to end up being to all of this, but like.
00:47:55
Those people are... you're never going
00:47:58
to have anything... The 996 kids, I mean, they don't...
00:48:01
There was a great line in this New York magazine
00:48:03
article about them the other week that was really great.
00:48:05
That was just like, most of them aren't dating, but almost all of
00:48:07
them are hiring and like, what's the difference?
00:48:10
That's how they view social interactions.
00:48:12
And like for them, Sora 2 is not, that is not the off-ramp they
00:48:15
want. I don't know.
00:48:16
I mean, we have the, we have the Friend CEO on Access this week,
00:48:20
the Access podcast, who's going viral for all those billboards.
00:48:25
Oh yeah, the subway ads, I've seen all those.
00:48:27
The AI pendant that you wear that's your friend. And he's like 22 and
00:48:32
is exactly the archetype you're talking about, Tom.
00:48:34
And it was really interesting to talk with him like he's already
00:48:37
like, how do I build my company for AGI?
00:48:40
Like a lot of these people that are that age and coming up and
00:48:43
starting companies, they, there's a phrase for this.
00:48:47
It's escaping me, but they're basically already, like,
00:48:50
rushing to build before, you know, none of this matters.
00:48:54
Oh yeah, they're like, if AGI comes by 2027 you only have like
00:48:58
2 years to amass generational wealth. Or, you know, the amassing
00:49:01
wealth thing, and... Like, yeah.
00:49:03
And that really came through for me with Avi.
00:49:05
Like I think there is this, like, fervor and, like, insane
00:49:11
drive to do as much as possible in a short amount of time and
00:49:15
like OpenAI basically... Collect...
00:49:17
Personified. Collect as much before none of
00:49:19
it matters anymore. Just like fill your bag with
00:49:22
lucre before the whole fucking thing.
00:49:23
Just, like, goes under. Yeah, what a fun time to be
00:49:27
alive. The perfect place
00:49:30
to end the episode. And I got a podcast plug in.
00:49:34
Look at that. Yeah, yeah.
00:49:36
Well, by the way, I mean, since we're we're winding down here
00:49:39
and you've mentioned it a couple times, you've done a good job
00:49:41
plugging yourself, but is there anything you want to mention?
00:49:44
I mean, we didn't really tee it up in the episode, but like, you
00:49:46
recently left The Verge, you know, you are a creator and an
00:49:52
influencer, and maybe... You've joined us in newsletter
00:49:55
world, independent media. Yeah, I'm on Substack at
00:49:59
sources.news, the Access podcast with Ellis Hamburger and Vox
00:50:03
Media. A former Snap employee, Tom.
00:50:05
Former Snap employee. Sorry, Madeline, you're probably
00:50:07
like, why are these guys talking about Snap?
00:50:09
Like, I don't think about Snap at all.
00:50:11
And it's because, well, no two people on earth have devoted as
00:50:15
much reporting, thinking power to Snap as Tom and I.
00:50:18
So of course we're going to talk about it on the pod.
00:50:20
The amount of times that I on vacation felt it necessary to
00:50:23
discuss with people, like, oh, you know, there's a shake-up in
00:50:26
the growth group. It was a simpler time, you know.
00:50:29
See, I love that, I love harkening back to the social
00:50:33
media wars days. It's quaint for me as a reporter during
00:50:36
the AI era. It was like talking about, like, the young kids, you
00:50:39
know, running, you know, the AI scene.
00:50:41
And certainly like young reporters, they do not
00:50:44
understand and give a fuck about the Snapchat days.
00:50:47
It is of zero interest to them. I was like, man, you don't
00:50:49
understand. This was the OpenAI of its
00:50:51
time. Like the chaos within that
00:50:52
company mattered so much to people.
00:50:54
And now I just sound old. We're old, it's OK.
00:50:59
We're old. It's OK.
00:50:59
You're not old. Well, you can keep generating
00:51:02
Sora videos of youth. It'll be great.
00:51:04
Yeah, all my prompts are like, at Tom Dotan, but younger Sam
00:51:09
Altman. Yeah.
00:51:10
Yeah, it is. Very.
00:51:11
We're beautiful. It is very weird to be meeting
00:51:13
founders who, like, have businesses with millions in ARR
00:51:17
and it's like, oh, you're 10 years younger than me.
00:51:19
Yeah, it does make you feel really old.
00:51:21
But yeah, no. Anyway, I'm sources.news.
00:51:24
Check me out. And yeah, thanks for having me
00:51:28
guys. This was fun. All
00:51:29
right. Thanks for doing it, Alex.
00:51:30
Yeah. Thanks again for tuning into
00:51:32
this episode of the Newcomer Podcast and shout out Alex Heath
00:51:36
for joining us. That was a lot of fun.
00:51:37
If you're new here, please like and subscribe.
00:51:40
It really helps out our channel and listen to new episodes every
00:51:43
week wherever you get your podcasts.
