AI stocks are booming, but are we nearing a breaking point? This week on Newcomer, we unpack how OpenAI, Anthropic, Nvidia, and AMD are fueling what might be the biggest tech money machine since the dot-com bubble. From trillion-dollar valuations to volatile deals and hype cycles, we explore whether this “AI economy” is sustainable, or if it’s starting to look like a Ponzi scheme.
00:00:00
Are AI stock valuations hitting all-time highs, and is this never
00:00:03
ending cycle of announcements and gains unsustainable?
00:00:06
This week we saw another round of significant partnerships
00:00:09
between OpenAI and a major chip maker, OpenAI making a deal
00:00:12
with AMD on data centers. In today's episode, I'm joined
00:00:16
by Madeline Renbarger to discuss how this dynamic is
00:00:18
evolving and just how much OpenAI is propping up public tech
00:00:21
stocks with the promise of future demand.
00:00:23
Later in the episode, Madeline shares a dispatch from
00:00:26
Anthropic's Claude pop-up in New York.
00:00:28
Hot AI labs are taking a note from fashion hype beasts, but is
00:00:31
it really going to translate into new business?
00:00:33
Finally, we're going to talk about a story I wrote about how
00:00:35
Hollywood's failure to adopt AI video generation products is a
00:00:39
sign that companies like OpenAI will be shifting away from
00:00:41
enterprise solutions. This is the Newcomer podcast.
00:00:53
This podcast is supported by Google.
00:00:56
Hey folks, Steven Johnson here, co-founder of NotebookLM.
00:00:59
As an author, I've always been obsessed with how software could
00:01:02
help organize ideas and make connections, so we built
00:01:06
NotebookLM as an AI-first tool.
00:01:08
For anyone trying to make sense of complex information, upload
00:01:12
your documents and NotebookLM instantly becomes your personal
00:01:16
expert, uncovering insights and helping you brainstorm.
00:01:19
Try it at notebooklm.google.com.
00:01:22
All right, so last week, Madeline, we had Alex Heath on
00:01:25
and there was this thing kind of stuck in my head from that
00:01:27
conversation, which was that at a dinner that he went to that I
00:01:31
was not invited to. Tom was not invited to the
00:01:36
dinner. I was not invited to the Sam
00:01:37
Altman off-the-record, no, on-the-record dinner, because...
00:01:40
You got Tom not invited. Yes, I've brought it up multiple
00:01:44
times on the show now. You know, Sam, I guess, was bragging after
00:01:48
a question, or hinting at, teasing your question.
00:01:52
I think Alex had asked him, which was that, like, how are
00:01:54
you going to pay for all of these, all of these deals that
00:01:58
you guys keep announcing? Like, where's the money going to
00:01:59
come from? And it sounded like Sam kind of
00:02:01
had like a panicked but excited glow in his eyes as he
00:02:05
responded. Like, I think I figured out a
00:02:06
new instrument, a new way to fund all of these deals.
00:02:10
And we sort of laughed at the time.
00:02:12
We were like, oh, maybe... there are not that many ways
00:02:14
to fund these things. There's raising debt, there's
00:02:16
selling equity, and then there's vendor financing and I guess
00:02:19
Right. Yeah, it's...
00:02:19
Yeah, selling equity. It appears that he may have
00:02:23
actually figured out a version of what kind of sounds like vendor
00:02:27
financing, but we may be witnessing what the instrument
00:02:29
is. And that's this AMD deal that
00:02:33
came out on Monday. And basically what this deal was
00:02:38
about is that AMD is agreeing to give 6 gigawatts' worth of chips
00:02:43
to OpenAI, worth tens of billions of dollars.
00:02:50
And in return, OpenAI is given warrants.
00:02:53
So they could purchase up to 10% of the equity of AMD if certain
00:02:58
performance targets are met. Just to clarify how that
00:03:01
adds up: it's a stock option.
00:03:03
Basically, they get the option to buy up to 160 million shares
00:03:07
of AMD. Right, right, which ends up
00:03:09
equaling out to 10% of the company and that's based on
00:03:13
performance targets that are met and whatever.
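The warrant math being described can be sanity-checked with a quick back-of-the-envelope calculation; this is just a sketch, and the strike price and share price in it are illustrative assumptions, not figures from the episode:

```python
# Back-of-the-envelope check on the AMD warrant terms discussed above.
# Assumptions (not from the episode): the per-share strike and the
# hypothetical AMD share price used below.

WARRANT_SHARES = 160_000_000   # up to 160 million shares, per the deal
STAKE_FRACTION = 0.10          # said to equal roughly 10% of AMD

# Implied total shares outstanding if 160M really is ~10% of the company:
implied_shares_outstanding = WARRANT_SHARES / STAKE_FRACTION
print(f"Implied AMD shares outstanding: {implied_shares_outstanding:,.0f}")
# -> 1,600,000,000

# Paper value of the fully vested warrants at a hypothetical share price:
hypothetical_price = 200.00    # assumed share price for illustration
strike_price = 0.01            # assumed nominal strike for illustration
paper_value = WARRANT_SHARES * (hypothetical_price - strike_price)
print(f"Paper value at ${hypothetical_price:.0f}/share: ${paper_value / 1e9:.1f}B")
```

At those assumed numbers the warrants would be worth on the order of $30 billion on paper, which is consistent with the "tens of billions of dollars" framing in the conversation.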
00:03:15
But after this deal was announced, AMD stock shoots up,
00:03:19
as has been the case for almost all of these open AI deals in
00:03:23
the last, you know, couple of weeks: Sam and a CEO of a chip
00:03:27
company make an announcement, the stock shoots up, and the
00:03:29
deal kind of pays for itself in terms of increased equity value.
00:03:35
Man, I don't know what to make of these things anymore.
00:03:38
It's really gotten to a point now where, like we talked about
00:03:42
last week in the story that I did, that like Sam has
00:03:45
infiltrated and like integrated himself into like the core
00:03:49
structure of all of these publicly traded companies
00:03:52
because of all the deals that they keep announcing.
00:03:55
How much higher can it get? How much longer can this
00:03:57
particular strategy keep going on?
00:03:59
That is just... I think that's the multi-trillion-dollar question,
00:04:03
Tom, right? Like, these deals, as they
00:04:06
function now, they seemingly juice the stock market and
00:04:10
everyone's happy, and the announcement from OpenAI
00:04:14
partnering with your company is enough to then fund said
00:04:17
partnership. So it's great and everyone's
00:04:19
happy, but it just seems to be adding a lot more risk.
00:04:23
And that doesn't mean that there's going to be an
00:04:24
inevitable crash if some of this doesn't work out, but it does
00:04:28
just seem like all of the public tech company stocks that have
00:04:32
the ability to partner with OpenAI are doing so, and that
00:04:37
is basically keeping us afloat right now.
00:04:39
Those partnerships are what are landing.
00:04:42
It's basically, I won't even say this is a new point.
00:04:44
This is just like we're hitting our point from last week again
00:04:46
on the head with this new AMD deal announcement.
00:04:49
It's like just when we thought it couldn't get any deeper, we
00:04:52
get another deal and Sam sparks this other point.
00:04:55
And obviously Nvidia's wrapped up in a lot of these deals too,
00:04:58
especially with Oracle. You know, Oracle spends, you
00:05:01
know, billions, tens of billions, on NVIDIA chips to run its
00:05:04
data centers for all this compute. So it's not solely OpenAI.
00:05:07
NVIDIA is very tied into this as well.
00:05:09
But I think it's really these two.
00:05:13
Yeah, well, it's interesting. Well, and Broadcom and the
00:05:15
chips that they're going to be building on their own that
00:05:17
supposedly will fit somehow into their, you know, infrastructure
00:05:21
play as well. But you know, the
00:05:25
NVIDIA deal was really this example of Sam and Jensen
00:05:29
getting super close and them basically intertwining their
00:05:33
fates even more so. And we referred to it as some sort
00:05:35
of like, well, I didn't use the term, but, you know, it's kind
00:05:38
of like a mutual destruction pact across the tech industry,
00:05:41
but certainly between those two. But then the AMD deal was
00:05:45
interesting because, you know, it is a competitor.
00:05:48
They are going to be developing chips that they would be using
00:05:50
in lieu of NVIDIA's GPUs. And NVIDIA stock fell a bit when
00:05:56
the news was announced. And so, you know, as close as
00:05:59
these guys are and as important as, you know, each company is to
00:06:03
the other's future, you are seeing at least some
00:06:06
diversification on OpenAI's part that makes sense.
00:06:12
You know, from one perspective, like, it'd be kind of silly
00:06:15
to place all your bets on just this one supplier of
00:06:18
this critical component. But you know, like, I don't know
00:06:22
if you read the articles or saw the interview... I didn't have a
00:06:24
chance to. But Jensen was on CNBC and he was like, yeah, that
00:06:28
was interesting. I didn't know they were
00:06:30
going to do that. Well, it's a very creative
00:06:33
financing deal that they put together.
00:06:36
Like that John Mulaney bit that's like, I didn't know you knew
00:06:38
how to do that. You know, like the horse in the
00:06:42
hospital. Look, we're all like making this
00:06:44
stuff up in real time, how to finance things at this scale.
00:06:48
And you know, it's funny. With Jensen, I didn't see the
00:06:51
video, so I couldn't detect, you know, if there was any sort of
00:06:54
derision in his, you know, demeanor about the AMD-OpenAI
00:07:00
deal. But like, let's not throw stones
00:07:03
here from the glass house, Jensen.
00:07:06
Like you have constructed an extremely circular deal with
00:07:10
OpenAI in which you are giving them, you know, effectively, or
00:07:13
there's a letter of intent to give them $100 billion in
00:07:17
exchange for them using $100 billion on your chips.
00:07:20
I mean, the circularity of every single deal that has been
00:07:23
announced here is on a different level.
00:07:26
Like this is a whole economy that is being propped up by
00:07:30
circular deals. I mean, like there's some
00:07:32
interesting charts that have come out.
00:07:34
Maybe we can flash it on the screen for those watching the
00:07:36
YouTube video. But you know, the
00:07:39
relationship between Microsoft, this goes back to the
00:07:41
beginnings of OpenAI, right? Like Microsoft gives OpenAI
00:07:45
billions of dollars that ends up being used as Azure
00:07:48
credits, which props up Microsoft stock for a while.
00:07:51
There's obviously NVIDIA stuff, there's the AMD stuff.
00:07:54
The Oracle thing isn't quite circular.
00:07:57
That's a pretty standard client, you know, provider relationship.
00:08:01
Right. It's $300 billion. It's a massive
00:08:03
deal, but it's for cloud services which they provide, so
00:08:06
you know. As Alex mentioned on the show
00:08:07
last week, like they are providing, you know, basically
00:08:10
Sam put the ball into Larry Ellison and Oracle's court and
00:08:14
said, like you guys figure out how to raise the money that
00:08:17
it'll take to build the things that you're asking for.
00:08:19
So I mean, I don't know, I guess we're back at the question that
00:08:21
we were at last week, which we don't have any more of an answer
00:08:23
to, which is like, how much longer can this go on?
00:08:28
How many more of these deals can open AI keep striking and the
00:08:31
markets, you know, keep responding in kind and saying
00:08:34
like, great, let's keep pushing up the value of these stocks
00:08:37
that you make the deals with. I mean, is that literally
00:08:39
all it is? Like, is it just like deals for
00:08:42
the sake of announcements that can prop up a stock and then
00:08:44
like kind of keep the machine like running?
00:08:48
I mean, we'll have to wait and see, honestly.
00:08:50
Yeah, I don't... I wish I had more info for
00:08:52
you. I'll dig around for a scoop if I
00:08:54
can. But yeah, it's kind of like
00:08:57
we're all in this waiting game together.
00:08:59
Yeah. And like, I don't know, credit
00:09:01
to Sam throughout all this stuff. Like, as absurd as
00:09:04
it is to a reporter and, you know, anyone with, I think, a
00:09:08
grounding in common sense, that you can kind of continue to fund
00:09:11
stuff based essentially on like future returns.
00:09:15
That seems to be the only way to operate at the scale that he
00:09:17
wants to operate, right? Like, this is a... I don't
00:09:23
know about legitimate, but it's like a way that you can, you
00:09:27
know, get to the hundreds of billions of dollars that he's
00:09:29
talking about to build. I guess it's just AGI.
00:09:33
That's just basically the play. Well.
00:09:34
That's the pitch, right?
00:09:36
Like we'll hit AGI and then all of this will be solved.
00:09:39
And it kind of feels, you know, like we're floating and
00:09:41
waiting for this milestone of AGI to, you know, keep
00:09:47
all of this from unraveling or changing.
00:09:50
And also, once we hit AGI, everything will be great
00:09:52
and it won't really matter as much.
00:09:54
Right, right. We're talking about, you know,
00:09:56
we have the skeleton key to unlock trillions of dollars in
00:10:00
transformations across the economy.
00:10:01
So this is all a bargain, and you're getting in on the ground
00:10:06
floor by, you know, investing in a stock that has suddenly got a
00:10:10
wildly inflated P/E ratio. Anyway, I mean, you want to move on to
00:10:17
kind of the other side of it.
00:10:18
So we have like hype in terms of stock market appreciation and
00:10:24
deals based on future returns. And then we have the hype based
00:10:27
on like brand and the, I don't know, emotional attachment that
00:10:33
certain people have to companies and large language
00:10:36
models. Like I'll be honest, I've been
00:10:38
very buried in a bunch of other stuff.
00:10:39
Like, what is going on with the Claude pop-up?
00:10:42
First of all, for a little context, all of the major AI
00:10:45
labs and coding assistant companies have discovered
00:10:49
basically brand activations, a marketing tactic that
00:10:53
was foreign to them until last week.
00:10:55
And then all of them kind of descended and realized that the
00:10:58
New York influencer ecosystem is actually a great place to get
00:11:02
your products up and running in the tech sphere outside of San
00:11:04
Francisco. So last week, Anthropic hosted a
00:11:09
limited pop-up for Claude at a coffee shop owned by the former
00:11:14
Vanity Fair editor. It's kind of a sceney place,
00:11:17
in SoHo. And people were waiting in line
00:11:22
around the block to pick up Anthropic-branded hats and
00:11:26
notebooks and see and be seen at their brand-
00:11:31
activated pop-up. It was definitely a huge
00:11:32
success. But I actually, if you'll, if
00:11:35
you'll excuse me, I'll go grab the hat.
00:11:37
I'll have to run for a sec. So you actually went off and did
00:11:39
this? I did.
00:11:40
I did manage to wait in the line and secure a hat.
00:11:45
While you're doing that, I can regale the audience with another
00:11:49
tale from the Snapchat reporting past that maybe only I will care
00:11:52
about. But this is all very reminiscent
00:11:54
to me of Snapchat Spectacles that came out, I think in like
00:11:59
2017 or so. This was Evan Spiegel's big play
00:12:02
to create camera glasses. And they were very cool.
00:12:05
And it was from a vending machine.
00:12:07
And they, I think, had one and they had one.
00:12:09
And there it is. There's the hat.
00:12:10
Oh yes, if you're watching on YouTube, you can see the hat
00:12:12
here. It is a thinking cap,
00:12:15
which I thought was actually pretty clever and it's part of
00:12:18
Anthropic's... Sorry to cut you off, Tom, but.
00:12:20
Yeah, no, people got the point. Basically, this has been tried
00:12:23
in the past by tech companies in New York playing into like the
00:12:26
desperate, hypey nature of people that hang around the New
00:12:29
York scene and love streetwear. Tell me about getting this hat,
00:12:33
Madeline. People, I have to say, people
00:12:36
are very excited to get the hat. I think it's quite a clever
00:12:39
thinking cap. It's part of their broader
00:12:41
thinking campaign because Anthropic is sort of positioning
00:12:44
itself right now as the Apple "Think Different," almost, you
00:12:50
know, fun, sleek, branded. You know, they have all the Claude
00:12:54
merch with its nice design. The logos on the different
00:12:58
pieces were clearly done by some thoughtful designer, rather
00:13:02
than, you know, just standard corporate logo on something.
00:13:04
And it has more of an illustrated touch, I have to
00:13:07
say. The hats, to plug our own
00:13:09
conferences, look quite similar to our Deus Ex Medicina
00:13:13
hats. So they got the memo that
00:13:15
lowercase font, simple, is really in right now.
00:13:18
Lowercase fonts, yeah. We were ahead of the curve on
00:13:23
that one. So let the record show.
00:13:25
But I will say they are positioning themselves in this
00:13:29
way that I think in some ways, given, you know, OpenAI's
00:13:34
deeper partnership with Microsoft, it almost seems like
00:13:38
there's this kind of foil of like the Mac versus PC campaign
00:13:41
that's like we're the fun creative model that's actually
00:13:47
making you think and avoid slop and not building a social media
00:13:52
platform yet. And I think those associations
00:13:57
are getting placed in with Anthropic's brand and the
00:14:01
actual pop-up structure. I know Cursor is hosting one
00:14:05
later in October in New York as well for the Cursor Cafe.
00:14:08
So you can pop in and get merch and coffee as well, and code
00:14:11
live with all your friends using Cursor.
00:14:13
But I don't think it has quite the same appeal as, you know,
00:14:17
the smart brand, in addition to the free hat, you know,
00:14:20
with the thinking and kind of how they're positioning
00:14:23
themselves in the marketplace. So we'll see how it lands in
00:14:26
terms of actually acquiring customers, but I think it was
00:14:30
smart. Describe to me the type of
00:14:32
person that was waiting in line to get a hat?
00:14:34
Like what was their scene? Who is their customer?
00:14:37
How would you describe like a tech hype beast?
00:14:40
I would say that's who was in line.
00:14:42
So that's a real thing now. I mean, these people have like
00:14:44
taken over San Francisco. Yes.
00:14:46
And now we have them in New York, too.
00:14:48
There were a couple people I know who at least from Twitter,
00:14:51
claimed to have flown from San Francisco to New York to acquire
00:14:54
the hats. So that is hype.
00:14:57
It's a bicoastal hype train here.
00:14:59
And I would say there were also a lot of people who were, I
00:15:03
would say maybe more... definitely developers, younger
00:15:09
crowds who are familiar with all these tools, but some
00:15:13
influencers from just the general creative sphere of New
00:15:16
York coming in to check out the lovely brand activation
00:15:20
and get the hat. So are they reaching an audience
00:15:23
that they necessarily want? Maybe there's people who maybe
00:15:25
wouldn't have used Anthropic otherwise that are showing up at
00:15:27
this. Yeah, I don't know what to
00:15:30
make of this personality type. This is in one sense something
00:15:33
that's been around for a while. Like it's the hype beast.
00:15:36
It is the person like camping out outside of the store getting
00:15:39
the... oh, shit. Like, what's the shoe drop that
00:15:41
they always do? I think that.
00:15:44
Is the shoe drop. Not Jordans, maybe not the And
00:15:47
1s. Well, Supreme would do all of the
00:15:49
drops like. Supreme would do the drops.
00:15:51
Yeah. I mean, look, there's no
00:15:52
question that is like a monetizable personality.
00:15:55
I don't know if that is the Claude audience. Like, are those
00:15:58
people coders? Like the thing that gets me
00:16:00
about this strategy is like you mentioned the Mac versus PC
00:16:04
analogy. Like, the Mac was the creative tool.
00:16:08
It was the personal tool. It was like the, you know, it's
00:16:11
you. Everyone should have a Mac in
00:16:12
their home because it shows how independent of a thinker that
00:16:16
they are. And also they're like, you know,
00:16:19
standing apart from, like, you know, the John Hodgman "I'm a PC"
00:16:22
crowd. But OpenAI is the consumer
00:16:26
product. OpenAI is, you know...
00:16:28
ChatGPT is a consumer product. Like, Anthropic is the API company.
00:16:32
They're the API enterprise company with a coding assistant.
00:16:36
I know. So in some ways it's a weird
00:16:39
consumer play, focusing on how good you are at enterprise and
00:16:45
thinking and using your brain business.
00:16:48
But it's with consumer language.
00:16:51
So maybe in some ways it could be to widen a broader audience
00:16:54
and eventually go for this. It certainly earned a lot of
00:16:57
goodwill. I just, I do agree with you.
00:17:00
I'm not sure if this is going to translate into like real money
00:17:05
for Anthropic, other than just beating everyone else out as
00:17:08
being the sceney AI lab. Yeah, again, it's just like, if
00:17:12
you know, people like Dario, the CEO. Not to say that Sam
00:17:16
Altman is cool by any definition of the word, but Dario is less
00:17:20
cool than Sam. Dario is like the geeky, you
00:17:23
know, effective altruist dude who, you know, has like stuffed
00:17:27
animals in the office and sort of runs the company in a very
00:17:31
sort of warm, fuzzy way. We need to... well, there's also
00:17:33
the kind of we need to make sure we don't enable the destruction
00:17:36
of the world through AI mentality.
00:17:39
But I just don't think of Dario as like a hype beast or a scene
00:17:42
kid. I think they're trying to make
00:17:44
him a hype beast because at this event they were giving away
00:17:47
little booklets of Machines of Loving Grace with his name on
00:17:50
them and his writing about the future and his vision.
00:17:54
So they're making it a
00:17:56
They're trying. They're trying to make fetch
00:17:57
happen. Yeah.
00:17:59
I mean, this is clearly some marketing person's, you
00:18:01
know, boardroom or, you know, whiteboard pitch that they were
00:18:05
like, I mean, why not just give it a shot?
00:18:08
Let's just see what happens. So, I mean, like, is this
00:18:11
going to go on down the line here?
00:18:13
Like do do you think like other companies need to respond in
00:18:15
kind? Like, are there other model makers?
00:18:17
Like, does Gemini need to, like, carve out their thing?
00:18:19
Like are they like for, I don't know, like sports fans?
00:18:23
Like are they like, is Gemini the like, you know, water sports
00:18:26
or something? Like, I just don't know.
00:18:28
Does there need to be branding for all these other competitive
00:18:31
models? What is XA is too?
00:18:33
What is Grok? I guess that's just, like... no,
00:18:35
they're just anti-woke. Yeah, theirs is
00:18:37
just like the conservative AI, I guess, if that's a thing.
00:18:42
Yeah, they have their own. The reactionary AI, yeah, they
00:18:46
have their own meetups separately. But yeah, I don't
00:18:49
know, but we'll see how this plays.
00:18:51
I found it very fun to attend, and very smart.
00:18:54
And as you know, working for Newcomer, I'm a big fan of live
00:18:57
events. I love a live activation.
00:19:00
I'm not sure how it's going to play ultimately other than just
00:19:03
getting bigger brand awareness and being known as something fun
00:19:07
and cool compared to actually generating a ton of revenue.
00:19:10
You were mentioning like the New York influencer scene, which I
00:19:14
observe only through TikTok. A lot of those people are paid
00:19:18
to appear. Did you get the sense that they
00:19:20
had like kind of shelled out a bit to get certain people to
00:19:24
show up? The day I went, no, I think
00:19:27
there was just genuine interest from people coming.
00:19:30
There were people who were paid kind of almost like, you know,
00:19:33
like people on the street, like brand people that were handing
00:19:37
out these Claude gift giveaways.
00:19:40
But mostly it was that rather than, you know, maybe some shiny
00:19:43
influencers showing up at this event.
00:19:45
So I do think they did have some organic growth, but they didn't.
00:19:48
I won't say that there were like people with millions of
00:19:50
followers from TikTok clamoring to get into the Claude coffee
00:19:54
shop. It was much more people in the
00:19:56
tech ecosystem. If they had flown out like
00:19:58
Aella or one of these, you know, netcapgirl, Twitter
00:20:01
people, then maybe we could talk.
00:20:02
But I think the influencer models are a little different.
00:20:05
Or like the New York like food influencer scene, the guys that
00:20:08
are all just showing up at like every new like soup dumpling
00:20:11
place in St. Marks, or the ones who are like, come with me and
00:20:15
you can try the best pancakes at the Golden Diner.
00:20:18
That wasn't the model, right?
00:20:21
They weren't getting those people. They didn't
00:20:23
have the VIP list, like, insulting you if you didn't
00:20:26
come down to the Claude coffee shop.
00:20:28
Right. Yeah.
00:20:29
There weren't, like... What's the influencer
00:20:31
scandal that always happens where, like, all these
00:20:34
influencers start, like, calling out these various restaurants
00:20:36
after making, like, completely unrealistic demands of them.
00:20:39
It's like, I want to bring three of my friends here and I want
00:20:42
like, a fully comped dinner and the restaurant's like, no.
00:20:44
And then the influencer, like, puts them on blast.
00:20:47
Yeah, yeah, they didn't do that for Claude.
00:20:49
They did not do that for the Claude Cafe.
00:20:51
You're safe another day and we'll see if they do more
00:20:54
activations. It seems like it was definitely
00:20:56
a big hit for New York people. I also think for, you know, New
00:20:59
York's second string here, we like when tech events come to us.
00:21:03
So I think there was a lot of demand of the New York tech
00:21:06
people just excited to go for something like this.
00:21:08
We don't have, like, you know, a tech activation on
00:21:11
every block these days. Yeah, well, you know,
00:21:15
we're a week away from Dreamforce in San Francisco,
00:21:18
which is the ultimate tech-activation, wannabe, try-hard
00:21:22
crew. It's basically like SantaCon
00:21:24
for people who like don't really understand what their job is.
00:21:27
That's the Dreamforce model.
00:21:29
You should stake out outside and get a hat or some merch or
00:21:32
something? Oh, there's all kinds of merch.
00:21:34
They're very, like, National Park oriented.
00:21:37
It's, like, kind of a Ranger Rick type thing with
00:21:41
Dreamforce. Yeah.
00:21:43
It'll actually be interesting to see.
00:21:44
Yeah, it is fun. Sure.
00:21:46
It'll be interesting to see if like the models, you know, if
00:21:49
any of the LLMs try to, like, set up a similar activation because,
00:21:52
like, they need, you know, they want to play within...
00:21:55
I mean, it's a ton of CMOs and CIOs showing up at
00:21:59
Dreamforce. It would be the perfect time for
00:22:01
them to hawk this thing. If you see a range of AI lab
00:22:06
coffee shops with sweet merch popping up around you, let me
00:22:10
know. I will.
00:22:11
I'll get the hat. You'll get the hat.
00:22:13
Yeah. What do you think about the
00:22:14
Cursor one, too? Because there are, like,
00:22:17
obviously a handful of gen
00:22:20
AI coding tools out there, but it does seem like Cursor has the
00:22:24
most brand recognition, like it's the one that in the space
00:22:28
people like maybe have the most association with or like
00:22:30
emotional attachment to. I'd say so, yes.
00:22:33
From enterprise customers, I would say it's kind of a
00:22:37
battle between Claude Code and Cursor in terms of which ones
00:22:40
people favor the most right now. So I think they have an audience
00:22:45
of, you know, rabid coding fans that will be excited to play
00:22:50
influencer in New York in a couple weeks for sure.
00:22:53
But, to your point, I don't
00:22:56
needs this brand activation. Like where's where's Windsurf?
00:22:59
You know, like doing this pop up.
00:23:01
Where are they going next? Well, Tom, I know you've been
00:23:04
digging a lot into Hollywood and AI, and obviously we've been
00:23:08
playing around with the Sora 2 app quite a bit.
00:23:12
Is it dead? By the way, before we go into
00:23:14
this next topic, like, you know, really slop maven Katie
00:23:19
Notopoulos declared on X that Sora is over.
00:23:23
What a title to retain, slop maven.
00:23:26
Hope it makes it around. I hope she hears this.
00:23:28
She loves this stuff. I mean, she's really one of the
00:23:30
best people that like on a shit posting out there.
00:23:33
I I have nothing but respect for for Katie Natopolis.
00:23:36
And so when she declares something and has weight, I, I
00:23:39
care when she has sort of a declaration on the state of a
00:23:43
thing that was popular. And she says Sora 2 is over,
00:23:47
that she can't get people to use her sign up codes anymore.
00:23:50
And you know that the all the big influencers or whatever that
00:23:55
existed have fled the platform. And its moment, brief as
00:23:59
it was, is, is over. But what do you what do you
00:24:01
think? She's right in, at least my
00:24:04
understanding. Anecdotally, I can't get any of
00:24:06
my friends to use my codes that you gave me with your invite, so
00:24:11
I've still got 2 left. If anyone wants 1.
00:24:14
I think there's some. We'll see.
00:24:16
I think there's some reach out Novel.
00:24:18
Yeah, you know, Madeline at newcomer.co if you want a code.
00:24:22
Prove me wrong. Do you want my code?
00:24:23
Sora's still going? No, I think it's kind of
00:24:28
suffering what, like, a lot of flashy, flash-in-the-pan
00:24:32
consumer features do, right? Like once the novelty wears off,
00:24:37
like how do you retain users to actually stay?
00:24:39
And I thought that they had an edge, as we talked about with
00:24:41
Alex last week, around, you know, having your friends
00:24:43
participate so that it's a bit more social.
00:24:46
But if people aren't really chatting or networking beyond
00:24:50
just posting the videos and the feed, do people want to stick
00:24:53
around? But it did, you know, drive a
00:24:55
ton of sign ups. So that was an effective play if
00:24:58
you wanted to get a lot of people to sign up.
00:25:00
Man, when Alex was on... and, you know, Alex is great,
00:25:03
and we hope to have him on a ton more, but I disagreed with him
00:25:06
at the time that this was, you know, a long-term play for
00:25:11
OpenAI on Sora specifically. Like, to me, it had
00:25:15
all the makings of the Studio Ghibli moment or any of the face
00:25:19
filters that had popped up on Snapchat or you know, I'm not
00:25:24
huge on TikTok. I mean, my account really is
00:25:27
not huge on TikTok, but as a user, I don't spend a ton of
00:25:29
time on there. But they have little memes,
00:25:31
obviously, that, you know, get big from time to time,
00:25:34
little games or filters or whatever on TikTok that sort of
00:25:39
like run the hype cycle and then disappear.
00:25:42
To me, this Sora AI video thing had all the makings of that.
00:25:47
And I think that is what it turned out to be.
00:25:50
It was kind of like, you know, a TikTok feature, basically,
00:25:55
that got its own app to its own credit, but effectively like,
00:25:59
you know, ran through in a very short period of time the
00:26:01
excitement, and then became sort of another toy that you put back on
00:26:05
the shelf. Yeah, I guess you could say,
00:26:07
if the feed itself was the launch and that was the big
00:26:12
product, maybe it hasn't landed yet or is suffering.
00:26:15
But if the tool itself was to get people to sign up and test
00:26:19
their video features and get a lot more training data, then
00:26:22
afterwards, after getting all of that, it was a resounding
00:26:25
success. So it's kind of like, it's
00:26:28
hard to launch a feature as a standalone company, but they can
00:26:32
punch it within Open AI and still work and get a ton of
00:26:35
information out of that. Right, right.
00:26:37
Yeah, it's always movable goalposts with it.
00:26:39
So, and it would be stupid to call Sora a failure by any
00:26:42
stretch, but it does seem like the people who had dubbed this
00:26:46
like, oh my God, they just made TikTok, wildly
00:26:50
overshot, you know, the attention that this was gonna
00:26:54
get from people, I think. We all ran back to
00:26:57
TikTok. Yeah, well, a lot of these
00:26:59
videos made their way to TikTok, yeah. They're also all
00:27:01
getting... you can save them.
00:27:02
So they're getting posted on other platforms.
00:27:04
I've seen Sora videos everywhere on TikTok and Reels, so
00:27:06
obviously that's free promotion for OpenAI.
00:27:08
But people aren't sticking around on the Sora feed.
00:27:11
They're using Sora the app to make the videos and then sharing
00:27:14
them elsewhere. Right.
00:27:16
Which, I guess, back to OpenAI, is why they're such an
00:27:24
interesting company, because they truly are doing everything.
00:27:24
I mean, the number of products and strategies that they are
00:27:27
juggling at the same time is incredible.
00:27:30
And it's like, at the same time that they have basically
00:27:34
attempted to build a TikTok and got to number one in
00:27:38
the App Store and got a huge amount of viral buzz.
00:27:39
They're also becoming like the next hyperscaler and building,
00:27:44
you know, and they're also going to be like the next big coding
00:27:46
tool and they're also the next Google with their search tools.
00:27:49
And you know, it's not even really to Sora's
00:27:57
detriment that it sort of may have peaked and fallen off.
00:28:00
It's just another example of what their ambitions can
00:28:03
be when they're really dialed in.
00:28:03
Watch out, Elon, OpenAI is coming to be the everything app.
00:28:06
That's the everything app, I guess.
00:28:08
Well, and you saw Elon's response to this, right?
00:28:11
Like, he's been pushing the next-gen
00:28:13
Grok AI image tool constantly. He's trying to catch up.
00:28:17
I mean, they're out-Eloning Elon right now.
00:28:20
Yeah, I happened to be in LA around the time that Sora was
00:28:25
peaking. And I've been doing this
00:28:28
reporting for this other story that I'm working on about
00:28:30
Hollywood and AI. And I was at this dinner with a
00:28:34
bunch of people from, you know, kind of traditional Hollywood
00:28:37
world, like the studios and also tech executives.
00:28:40
And it was kind of fun to ask around about Sora in this moment
00:28:45
that, you know, they had so aggressively...
00:28:50
Wait, yeah, what did they say when you were there?
00:28:52
Well, I guess like, first of all, I had earlier that day
00:28:55
talked to Varun Shetty, who is the head of Media Partnerships
00:28:58
at OpenAI. And he laid out to me, which we
00:29:02
later published in the newsletter, their
00:29:06
thinking around IP and why they decided to allow Sora
00:29:11
initially to kind of let people have a lot of looseness with
00:29:16
their ability to create likenesses of, you know, very
00:29:20
famous characters. Like I made something with Peter
00:29:23
Griffin. I did something with Kermit the
00:29:25
Frog. It wasn't that hard in the first
00:29:27
couple days to basically get these guys to do whatever.
00:29:30
And Varun's response to me was, like, one, we want to give
00:29:35
people the freedom to create what they want.
00:29:37
And, you know, the more guardrails we put in, you know,
00:29:39
the more that's kind of hemmed in.
00:29:42
And then the other side was also everyone else is doing it
00:29:45
basically like you didn't want to be at a competitive
00:29:48
disadvantage. And I think they got a lot of
00:29:52
heat for saying that. And later that day, Sam put out
00:29:57
a blog post saying, like, oh, we're switching our policy and
00:30:00
we've moved away from, you know, opt out to opt in.
00:30:04
And basically that whole period where, you know, it was the Wild
00:30:07
West of IP violation became not that and things, you know,
00:30:12
really got hemmed in. And so that was interesting on
00:30:15
its own. And maybe that killed a little
00:30:17
bit of the Sora engagement too, because people liked the novelty
00:30:20
of making Rick and Morty trail cam footage.
00:30:23
Yes, 100%. I think that probably would
00:30:27
have worn off too, but there's no question that like it
00:30:30
definitely slowed the momentum of the whole thing.
00:30:34
But at this dinner, you know, I brought up, you know, Varun's
00:30:39
comments, and the response was like, you would only say
00:30:42
something like that, this being Varun or OpenAI in general, if
00:30:45
you didn't really care about Hollywood, if you didn't really
00:30:48
view them as a potential client for your technology.
00:30:54
And you know, it's been interesting to see the evolution
00:30:58
with open AI and the movie industry or the entertainment
00:31:01
industry, even just in the last year, because a year ago when
00:31:04
they were doing Sora V1, they were taking lots of
00:31:08
meetings with all the studios and basically trying to show
00:31:11
them, and they didn't bite, that, you know, this is a very
00:31:13
useful tool, you should work it into your
00:31:14
workflow, and we're here to be a friend.
00:31:17
And almost like the Runway model, but
00:31:19
their version, right? Yeah, exactly.
00:31:21
I mean, Runway's interesting because they actually have
00:31:24
signed a major partnership with Lionsgate.
00:31:26
And so they have kind of embedded themselves
00:31:29
closely, well, relatively closely, with the
00:31:32
entertainment industry. OpenAI is still this kind
00:31:34
of foreign body. So the perspective was, like,
00:31:37
from Sora V1 to this one was a pretty stark change in their
00:31:42
views on the industry. And it's one thing to like go to
00:31:46
the meetings with people, as they were a year ago, and say,
00:31:50
oh, we have this tool, you should use it.
00:31:52
It can make your lives all better.
00:31:53
And, you know, we're here to help. Whereas this one was
00:31:55
like a fait accompli, we're barreling in.
00:31:59
We'll give you the option to opt out.
00:32:00
But otherwise, don't be surprised if you see, you know,
00:32:03
Pikachu on the grill. And, you know,
00:32:11
that's just a choice. Like, Hollywood is sensitive.
00:32:14
You know, people really take this stuff very seriously,
00:32:16
you know, the town is run by lawyers, and when you go about it
00:32:20
in the way that OpenAI did, they certainly sacrificed a
00:32:23
relationship with the industry in exchange for virality.
00:32:26
And I think that was the biggest take away from these people.
00:32:29
Maybe OpenAI was just making the calculation that they don't
00:32:33
need the Hollywood industry to do what they're doing.
00:32:36
They exist as the Everything app without it.
00:32:40
Right. Well, there's basically two ways
00:32:41
to look at it, right? It's either like they've decided
00:32:44
that there's way more market share and money and attention to
00:32:47
be given to short form and basically, you know, living in
00:32:50
the AI slop TikTok...
00:32:54
Yeah, TikTok apocalypse or slop apocalypse?
00:32:59
Slop apocalypse. Yeah, there's more, you know,
00:33:02
more value to being there than in trying to like ingratiate
00:33:05
yourself with Hollywood. Like, that's one choice.
00:33:08
The other choice is maybe they're like, well, we'll just
00:33:10
get so big that we'll be inevitable, that we'll have such
00:33:13
a good tool that it doesn't matter if you hate our posture
00:33:17
because we're just, you know, unavoidable.
00:33:21
And yeah, I don't know, what do you think?
00:33:22
Like, where does OpenAI go in terms of
00:33:27
their... I mean, you're a
00:33:28
movie fan, like, you know, you care a lot about, you know,
00:33:31
the future of... Madeline Renbarger, noted movie
00:33:36
enthusiast on this podcast. Yeah.
00:33:39
I mean, I think if they're choosing to kind of not go the
00:33:45
way of Hollywood, as they are, these other apps that have
00:33:51
market share, tools like a Runway or video
00:33:54
editors or animation tools that appeal more specifically to the
00:33:59
industrial creative market rather than the social media
00:34:02
creator market, have enough market share to fill that gap.
00:34:07
And I kind of think they don't necessarily need it.
00:34:10
They can play in social media world; they don't need to play
00:34:12
in Hollywood world. And it's not.
00:34:15
I mean, I love the movies, but they're not as big of a market
00:34:19
as what their ambitions are. You know, it's not the industry
00:34:22
it once was. So why would they?
00:34:24
Totally. They don't need it.
00:34:26
They don't, right. That's exactly right.
00:34:28
They don't need it. There is obviously a
00:34:29
market there, but it's, you know, largely for kind of
00:34:32
artistic, ethical, emotional reasons at this point to be
00:34:36
supporting it, rather, like, to be
00:34:39
known as the friend of the artists and get better, you know,
00:34:42
cred with the normies about that.
00:34:44
It doesn't have any like business value.
00:34:47
Right. And it's interesting as an
00:34:49
evolution just within the tech world because 5 to 10 years ago,
00:34:54
it was extremely important for a lot of up and coming startups to
00:34:57
announce that they had gotten a famous actor on their cap table.
00:35:02
You know, it's like we all remember, you know, oh, Robert
00:35:05
Downey Junior has backed Maker Studios or Ashton Kutcher.
00:35:11
I mean, he's probably a more legit investor.
00:35:13
Yeah, he's a more, you know, legitimized investor, as much as
00:35:17
any VC is a legit investor. But yeah, like an
00:35:23
actor was just kind of part of the thing.
00:35:25
Like in your A round or B round, you gave some allocation to
00:35:29
an actor and you don't see that anymore.
00:35:32
And I don't know if that is because the nature of being a
00:35:35
celebrity is just different and Hollywood doesn't have the
00:35:39
same control over culture as it did, you know, 5-10 years ago
00:35:43
or, I don't know, it just never worked that much in the first
00:35:47
place, so people just moved on from it.
00:35:48
But this feels like a part of that same kind of trajectory of
00:35:52
like Silicon Valley just not needing Hollywood one way or the
00:35:54
other. Anyway, I don't know.
00:35:56
I had a good time in Los Angeles.
00:35:57
More to come on that front, by the way, so.
00:35:59
Stay tuned. Subscribe to Newcomer to hear
00:36:01
more of Tom's thoughts on Los Angeles.
00:36:03
Yes, and Hollywood. Well, anyway, to hype beast
00:36:11
Madeline Renbarger, signing off with her thinking cap.
00:36:11
Yes, got to flex the thinking cap once again. Like and
00:36:15
subscribe. Thanks so much.
00:36:17
That's our show. Thank you for tuning into this
00:36:19
week's episode of the podcast. If you're new here, please like
00:36:22
and subscribe. It really helps out the channel.
00:36:24
Listen to new episodes every week wherever you get your
00:36:26
podcasts.
