Shaun Maguire's Growth Hack
Newcomer Pod · July 11, 2025 · 00:42:27 · 38.88 MB

The Newcomer Podcast returns just in time for Eric, Tom, and Madeline to weigh in on Meta's audacious AI hiring spree. The tech giant has enticed researchers from OpenAI, Google DeepMind, and Anthropic with huge paydays in the hopes it can bring its Llama models up to par with the competition. Next, Ramp's report suggesting companies have slowed their purchases of AI tools made a lot of buzz this week, but it's still too early to call an AI peak.

Later in the episode, Grok's offensive replies aren't enough to slow down xAI's latest model launch. We close out the episode rehashing Sequoia partner Shaun Maguire's latest inflammatory tweets about New York mayoral candidate Zohran Mamdani.


00:35 — Meta poaches top researchers to aid its flailing models

11:13 — xAI outperforms despite Grok's offensive replies

18:27 — It's too early to call the AI bubble

28:50 — Shaun Maguire's tweets bring attention to Sequoia



00:00:00
Hey everybody, I'm back. We're back.

00:00:02
Eric, Tom, Madeline, we're all back on the Newcomer Podcast.

00:00:07
I've been here, I've been here. Yeah, So the show that you may

00:00:11
have liked or may not have liked has now returned.

00:00:13
Listen, I think we got a lot to go through.

00:00:16
There's been a lot happening in AI. And I understand this is not the

00:00:19
Cerebral Valley podcast specifically, but money is

00:00:22
flying. Money is flying here in tech.

00:00:25
If you are an AI researcher, you should be making at least 8

00:00:28
figures. You should probably start

00:00:30
claiming that you're an AI researcher to just see like what

00:00:33
the offers look like. Where do we even start?

00:00:35
Yeah, Madeline, you want to give us a rundown?

00:00:37
I mean, you've been digging into, I feel like there are

00:00:40
tables on tables of different Meta employees making a bunch of

00:00:44
money. We've tried to make the table to

00:00:46
end all tables, just pulling them all together. What's

00:00:49
your read right now on Mark Zuckerberg's, uh, recruiting

00:00:53
frenzy spending bonanza here? Right.

00:00:56
Well, 8 figure comp packages from Zuckerberg are the entry

00:01:00
level. Some people are getting, you know, 9-figure comp packages

00:01:02
out of this hiring spree. But it really kicked off in

00:01:05
earnest last month with Meta's acqui-hire of Alexandr Wang from

00:01:10
Scale AI to lead the new Meta Super Intelligence Lab team

00:01:14
within the company. Meta paid over 14 billion for a

00:01:17
49% stake in Scale just to get Alex over on the team.

00:01:22
But since then, he's also poached Nat Friedman and Daniel

00:01:25
Gross there and is, you know, buying out a stake in their

00:01:28
investment fund to have them come work for him.

00:01:30
So really just unprecedented amounts of capital wrangling to

00:01:34
get the biggest names in AI to lead the team.

00:01:37
And they've got, you know, obviously they've been picking

00:01:40
OpenAI's pocket. They've got more than a half

00:01:43
dozen people from OpenAI. They've got people from

00:01:45
DeepMind, Anthropic, Apple. As of this week, as of Thursday's

00:01:49
recording, they've poached at least 10 people from OpenAI to

00:01:52
join this team, in addition to those other big names.

00:01:55
As a side point to all of this, I mean, you know, a lot of the

00:01:58
narrative really kicked off, you know, when, like you said, the

00:02:01
scale deal happened. But then Sam, I thought really

00:02:04
unstrategically, goes on, I think, his brother's podcast.

00:02:08
And he's just like, yeah, they're coming at us, but they

00:02:10
haven't gotten anyone good yet, which is a stupid thing to say.

00:02:14
He's like egging them on, on all fronts. We

00:02:17
have, clearly, Zuck, one of the most psychotically

00:02:20
competitive people in the history of, you know,

00:02:22
civilization, and saying like "you can't get our people" is like

00:02:25
just red meat for him. But also like you're basically

00:02:28
saying everyone that they did get was shitty and they did end

00:02:31
up getting shitty people. And like I understand like Sam

00:02:34
is still figuring out how to, like, square his

00:02:36
Like I'm a deep thinking AI guy and I'm very calm with the fact

00:02:40
that he also is psychotically competitive and doesn't like the

00:02:43
fact that Zuck is coming at him. The philosopher-king posture

00:02:46
with the cutthroat business leader of our AI future. One

00:02:50
could also say missionary versus mercenary in that dichotomy

00:02:53
there. It's not just OpenAI, to be

00:02:55
clear, they've poached a lot of DeepMind researchers.

00:02:59
And then Ruoming Pang from Apple, who Bloomberg is now reporting

00:03:03
they have poached with a $200 million over four years total comp

00:03:07
package. So these are serious offers just

00:03:11
from across the table of AI research.

00:03:13
People should be cheering for this.

00:03:14
This is my my first take I think.

00:03:16
You know gone are the old days when Apple and Google had anti

00:03:20
competitive alliances to artificially suppress salaries.

00:03:24
I mean, this is the working class. These are

00:03:28
workers getting paid a lot of money for producing. Tom, I'm

00:03:33
curious what your take is. It's like.

00:03:35
Yeah, this is ultimately the proletariat winning.

00:03:39
Yeah, sure. The white collar proletariat,

00:03:41
except that they're all basically creating technology that will

00:03:43
one day make white collar jobs no longer viable.

00:03:46
Yeah, you're right. There's a certain logic.

00:03:47
If you're the last man to ever work, it's like, yeah, you take

00:03:50
the money, man. A lot.

00:03:51
Like, right, like this is generational wealth because your

00:03:53
children, if they're also white collar workers, will never have

00:03:56
jobs. Yeah, I don't know.

00:03:58
I, I guess like we've been following the kind of turmoil at

00:04:02
meta on their AI team for as long as I've been with, with you

00:04:04
guys. I think it was like the first

00:04:06
story I wrote at Newcomer was like, man, Llama is, is sucking

00:04:10
right now. Like they just have not done a

00:04:12
good job keeping up, and DeepSeek, I think, really fucked

00:04:15
things up for them. And so at some point, and there

00:04:18
were some good stories in the, you know, New York Times and

00:04:20
elsewhere about kind of like internally at meta what's been

00:04:22
going on. But Zuck just realized, like, if

00:04:25
we're going to be playing at this, we need to use the only

00:04:27
asset that we really have here right now, which is like, you

00:04:30
know, endless, deep, deep, deep pockets.

00:04:32
It's just money. Yeah.

00:04:34
And that's always kind of been Zuck's thing, right?

00:04:36
You know, like the ideas side of him is not always sharp.

00:04:39
He he has money and he has network effects.

00:04:42
There's a lot of users, yeah. Yeah, yeah.

00:04:45
So like Facebook is, you know, not growing at all almost.

00:04:48
And so you buy Instagram and you and you get WhatsApp, you

00:04:50
basically just buy your way into like continued relevance as a

00:04:53
company. And like there's no specific

00:04:55
company to buy, really, in AI that's going to make you, you

00:04:59
know, top tier. So just get the researchers, because

00:05:01
it's really just talent. Well.

00:05:04
There's some, there's, there's a very, very limited pool of

00:05:07
researchers that can be building these models at this scale.

00:05:09
So it is really like a small pool of people that can be bid

00:05:14
on at the Super high comp packages.

00:05:16
Like there is just a limited pool of worker resources to go

00:05:19
around. You guys are a little deeper in

00:05:21
it than I am, but I have to imagine OpenAI's weird

00:05:25
shareholder structure is not helping it retain these

00:05:28
employees. Like, first of all, you have

00:05:30
this question mark around what these shares really mean and how

00:05:34
how the company sort of sorts out.

00:05:36
Whereas at Meta, you know, you're getting

00:05:38
public shares, you know what they're worth.

00:05:41
And then there have been all these secondaries that allowed

00:05:43
OpenAI people to get liquidity. And so it's like, whoa, I've

00:05:48
already gotten paid out from you, I can go on to the next

00:05:50
thing. Or what's your read on how

00:05:52
OpenAI set this thing up? Yeah, I think, well, this

00:05:55
brings us back to, like, Sam's, you know, hubristic comment

00:06:01
like "you can't poach from us." OpenAI has been losing people

00:06:04
constantly over its life as a company.

00:06:06
This is not like a company full of lifers.

00:06:08
There are some. I mean, Anthropic exists as

00:06:10
a defection from OpenAI. The story of OpenAI is a story

00:06:14
of people leaving OpenAI. Basically, right?

00:06:16
Like Elon, Dario, you know, Mira, Ilya, like it's, it's

00:06:22
an endless list of alumni that, that come there.

00:06:24
So this idea that these guys are like all missionaries who want

00:06:27
to stay there for life, it's just not true.

00:06:29
And it's a complete... I think the sympathetic argument

00:06:32
is that they're missionaries in that they want to be there when

00:06:35
AGI is born. I think a lot of them do want to

00:06:37
be in the place where it's born.

00:06:39
And if you can convince them like, hey, you know, our our

00:06:43
superstar team is moving from this place to this place, you'll

00:06:46
have another shot at it. Or do you think this is about money

00:06:48
and not about... Well, it is AGI.

00:06:51
Yeah. Well, I also like in reporting

00:06:52
out that piece that I did a couple weeks ago about basically

00:06:56
all of this and and Meta's recruitment, I was really struck

00:06:58
by one researcher I talked to who was, like, basically describing an

00:07:02
AI model as a recipe. And there are certain people who

00:07:04
have the recipe in their brain, and that's what you're paying for

00:07:07
when you poach someone from a lab to go to the new place so

00:07:09
they can recreate the recipe. And what's going to end up

00:07:12
happening is that like you have all the recipe knowers at the

00:07:15
different labs around the, the industry and they're probably

00:07:18
all going to reach AGI at the same time or very close to each

00:07:21
other. And so I think what my guess is

00:07:24
like the realization a lot of researchers had is like, there

00:07:26
isn't the place that it's going to happen.

00:07:28
It's all leaky now, and everybody's going to

00:07:29
have it, might as well get paid the most along the way, yeah?

00:07:32
So why not, why not get yours? Like, if that money's out there,

00:07:34
what are you going to be like a missionary and take less money

00:07:37
with like weirdly illiquid stock so that you can like maybe get

00:07:40
AGI two weeks before the next lab does?

00:07:43
Like, it doesn't really compute. And so I think that more than

00:07:46
anything. Which is, man, we need to build a business just

00:07:49
for these $100 million employees, like whatever they want to read,

00:07:52
and then just the advertising budget on that, like get in

00:07:54
front of, like, the lunatics. Just, like, whatever people pay.

00:07:59
Like a ten-person email. You know, there are, like, hedge fund

00:08:01
newsletters; we'll have, like, the AI researcher one, you know,

00:08:04
like, 20-person. How much would Zuck pay us so

00:08:08
that we could have like, you know, ads in front of like the

00:08:10
10 most important researchers? Exactly.

00:08:12
Just like on their minds. Yeah, so like that, that's,

00:08:15
that's all been very funny to watch.

00:08:17
Yeah, play out. And then you know, like we'll

00:08:19
see how it all happens with meta, right?

00:08:21
Like they have the teams in place and I'm interested to see

00:08:24
how long it takes for them to, like, redo the Llama training to

00:08:28
get to a point where this was all worth it.

00:08:30
At least they'll get a model that's competitive because

00:08:32
that's the problem. I mean, making money off of it

00:08:33
is a whole different story with with that company.

00:08:36
They're open source. They've never really made a dime

00:08:38
off of it. Like this is going to be a long

00:08:40
term play for Meta to make it work.

00:08:43
But in the meantime, all these guys get rich.

00:08:46
So what does this all mean for Llama?

00:08:48
Obviously Llama 4 was sort of a disaster.

00:08:50
Do you think Meta sticks with its open source strategy of just

00:08:55
making no money and foiling the competition?

00:08:58
Or do you think, having now spent all this money, Meta had better come up with

00:09:01
a proprietary product for itself?

00:09:03
I hope they do. I hope they keep with the open

00:09:05
source thing because to me if Zuck's whole play is about

00:09:08
fucking over Sam just. Burn the forest down.

00:09:11
That would be the best way to do it, right?

00:09:13
And I always thought that was the smartest aspect that he had

00:09:15
there in the first place, like, because he's very

00:09:17
jealous, I'm assuming, of Sam. And as Sam was coming up to being

00:09:20
like the voice of Silicon Valley in this moment, he's like,

00:09:23
oh, well, what if we just take the same technology but release

00:09:25
it for free? That'll destroy these people.

00:09:27
They'll have no business, which I thought was cool.

00:09:30
I actually thought that was like an actually badass play.

00:09:32
And if he decides to get rid of it because he just wants to make

00:09:35
money in some consumer product, I will be really disappointed.

00:09:38
There is a very subtle monetization mechanic on this,

00:09:41
which I saw YouTube said that people who dump like AI stuff

00:09:47
onto YouTube will not be allowed to monetize it.

00:09:49
And there's a certain argument that, like, Google and Facebook,

00:09:53
they could be the content providers.

00:09:56
You know, it's like if, if there's this AI appealing

00:09:59
content, it's like you block everyone else from it and you

00:10:01
are sort of the exclusive content provider.

00:10:04
I don't, I don't know that's. I haven't heard that.

00:10:06
That's interesting. I mean, The funny thing with

00:10:09
Facebook and AI or Meta and AI is that, you know, like when

00:10:13
stories would come out that like Llama 4 Behemoth is delayed,

00:10:16
their stock drops. And I didn't ever really

00:10:18
understand that, because they don't make any money off of it.

00:10:20
But there's just this general acceptance on the part of

00:10:23
investors that like, well, AI has to matter for everyone.

00:10:26
So it's important that you like have a, you know, a.

00:10:28
And there are, like, you know, open source projects. If you're,

00:10:31
you know, if you're around them, usually you know, people have

00:10:34
figured out how to make money. Sure, it's a distribution play

00:10:37
like that's what it is with Google and Chrome and

00:10:39
Android. Did I say Chrome?

00:10:41
I mean, Android with Google and Android, so yeah, sure, I don't

00:10:44
worry too much about like their ability to make money off of it.

00:10:47
But for this in the short term, having this, you know, open

00:10:50
source high-level model. Like, we saw DeepSeek really

00:10:54
throw the industry into turmoil for at least a little

00:10:56
bit. You know, you think Meta could

00:10:58
do that at a much larger scale. So it's kind of exciting.

00:11:01
I'm betting on closed source to be clear.

00:11:03
I mean, that's the smarter bet, but also the two biggest

00:11:06
companies that are succeeding on it right now are closed source,

00:11:09
right, Anthropic and OpenAI, and then I guess Google as well.

00:11:12
So yeah, that's a. That's a safe bet.

00:11:14
Well, moving on, outside of open source or closed source, you

00:11:16
probably don't want your model becoming MechaHitler.

00:11:19
So that was the other big story: political

00:11:22
orientation. These days, that's

00:11:23
true, I don't know. If it happens, it happens, yeah.

00:11:28
Unfortunately, we probably don't want the AI to be MechaHitler

00:11:30
on this podcast. But yes, we're talking of course

00:11:33
about Grok 4, xAI's new model that was debuted

00:11:38
by Elon Musk earlier this week, which was overshadowed by its

00:11:41
Grok chatbot interface on X. Just ramping up the racism and

00:11:45
politically incorrect speech in all of its replies to 11.

00:11:48
So you can't really be like, oh, we know what

00:11:50
that last name means, you know, just like it's it's so.

00:11:55
Bad. I couldn't tell if it was real,

00:11:58
right? Did you see the tweet or it was

00:12:01
like a screenshot of one that was like Grok 4 was getting

00:12:04
really explicit about Linda Yaccarino.

00:12:07
Did you see that one? I saw something about Will

00:12:09
Stancil. Yeah.

00:12:11
How did Will Stancil get inserted?

00:12:13
That was true. Oh, no.

00:12:14
Will Stancil is, like, the lead anecdote in the Journal story

00:12:17
about this, which I think is a little ridiculous.

00:12:19
Like, you know, come on. It's like, I

00:12:20
mean, people are, like, walking it into this. They're, like, convincing the

00:12:22
model to do this right. It's like they're doing like

00:12:25
thought experiments sort of stuff.

00:12:27
Yeah, it's like... I don't know, there's clearly an issue

00:12:31
in terms of, like, you know, its guidelines and how it's

00:12:35
allowed to interact with people. I don't honestly think Grok was

00:12:38
giving people specific ideas on how to like attack and rape Will

00:12:41
Stancil. And this was like a, you know,

00:12:43
he. Was a random Lib Twitter account

00:12:46
that's too deeply ingrained in our brains.

00:12:48
He also ran. Thanks to, thanks to you, I

00:12:51
guess. Can I speak out in defense of

00:12:53
MechaHitler for one second? Sure.

00:12:56
Not a take I expected. I can't believe I'm going to say

00:12:59
this. I think xAI deserves a lot of

00:13:00
credit for what they've built within the model game.

00:13:04
They are a far smaller lab than almost all the other companies.

00:13:07
They basically were stood up by, you know, Igor Babushkin, who is

00:13:11
this ex DeepMind guy who came in there definitely on the, you

00:13:14
know, premise of wanting to build the anti-woke OpenAI, the

00:13:17
anti-woke, you know, ChatGPT, and they fucking did it.

00:13:21
Like Grok 3 is a really good model.

00:13:23
It, like, it's absolutely competitive with all the other

00:13:26
big models. They built this giant data center

00:13:28
in Memphis, Colossus. I don't know yet how good Grok 4

00:13:32
is, but, like, according to the benchmarks, which can be

00:13:34
gamed, so you don't want to give too much credit to

00:13:37
it. And as we saw with Llama 4,

00:13:40
you can do well on the benchmarks and actually have a

00:13:41
completely useless model. But like I look at Microsoft and

00:13:45
what they tried to do with Mustafa in hiring a top tier

00:13:47
researcher. They had tons of resources.

00:13:49
They have not put out a large model.

00:13:51
You look at Meta, they have completely fucked up their whole

00:13:53
thing. Grok is out there putting out

00:13:55
models that seem to be fairly competitive, if not better than

00:13:57
the rest that's out there. So like, I can't believe I'm

00:14:00
defending Elon in all of this, but like maybe you get a little

00:14:04
MechaHitler on the way to doing it.

00:14:06
It's, like, smart enough; who cares what it is?

00:14:10
I mean, maybe the smart, yeah, it's like, you know, we all have

00:14:14
friends like that, that sometimes they're really smart,

00:14:16
but sometimes they, you know, advocate Mecca Hitler ideology.

00:14:19
I just think about... Your, your takes are all over the

00:14:23
place, no? No, no, I mean, I I'm I'm

00:14:25
obviously joking about MechaHitler, but genuinely speaking,

00:14:29
like I think you know what, like Maddie, you said like it

00:14:31
overshadowed like the release of Grok 4, which is maybe true

00:14:34
because MechaHitler was very funny, maybe scary, if you're

00:14:36
Will Stancil. And also inadvertently led to

00:14:39
Linda Yaccarino stepping down, I think in some big part.

00:14:42
So we. We don't know.

00:14:44
We don't know. I haven't read a good story

00:14:46
about it yet. Yeah, it could have.

00:14:48
It could have, I just, I don't know.

00:14:50
I, I, I don't even want to come off as contrarian here.

00:14:53
I just think like what Xai is doing is impressive.

00:14:55
Like in the world where like most labs do not put out very

00:14:59
good models, like they consistently or at least twice

00:15:01
in a row put out something that seemed to matter.

00:15:03
I mean, somebody was commenting that, you know, xAI, like,

00:15:06
their presentation was like the stuff of engineers.

00:15:09
It just feels like a company that's all engineers, no

00:15:12
philosopher kings, and I would prefer one of the philosopher

00:15:16
king companies win, or xAI to get some scruples.

00:15:20
I mean, what is freedom in the context of a model?

00:15:24
You know, I mean, the model has like a point of view, you know

00:15:25
what I mean? Like, there isn't this sort of neutral stance; no

00:15:29
matter what, they're going to direct it towards one thing or away

00:15:32
from another. I'm interested in seeing how it

00:15:34
actually is implemented and used by people because if it's like

00:15:38
actually impressive, you know, people shouldn't just discount

00:15:42
all of that because, you know, it seemed to have some post

00:15:44
training issues. Right.

00:15:45
I mean, I don't think they will. I mean, first of all, Grok has

00:15:47
this huge advantage of being in front of everybody on X.

00:15:50
So if they have something good, it's it's got the users in a way

00:15:54
that people like Anthropic don't.

00:15:56
Yeah, they also have a great base of text based training data

00:16:00
with all of this, you know, real human text data.

00:16:03
Well, that's clearly where the Nazi shit came from.

00:16:05
I, I will do the classic reporter thing, which is I, I

00:16:08
wrote in a story. I forget if it was my bear case

00:16:12
on OpenAI or what, but I've been pretty bullish on Grok

00:16:15
for a while, despite, you know, being somewhat of an Elon

00:16:18
detester. I didn't know that.

00:16:19
What's your? What's your argument for that?

00:16:21
It was just, like, talking to

00:16:22
investors that they really thought like Grok was going to

00:16:25
catch up. And that like the internal signs

00:16:27
were that they had really good people and that just investors

00:16:30
were taking it very seriously in a way that, you know, they still

00:16:35
had to prove to the public. I mean, it's raised a ton of

00:16:37
money, you know, and I, I don't think the people making those

00:16:40
investments are idiots. You know, I think they thought

00:16:44
Elon will catch up, and it seems like he has.

00:16:47
It's always unclear with the Elon investments, right?

00:16:49
Like to a degree, it always feels like it's a little bit of

00:16:51
like, well, if we give Elon money for this maybe idiotic

00:16:54
project, we'll get access to space.

00:16:55
But Grok was so much the xAI stuff and Twitter, there's a lot

00:16:59
of money to do it. Also, I think those started to

00:17:03
need to pay off. Those were the ones I think, you

00:17:05
know, like Boring Company. Some of those you're like, oh,

00:17:07
OK, well, I'll throw them a bone.

00:17:09
Hope some of the other ones work out.

00:17:10
But like, I think xAI was one of the ones that needed to

00:17:13
work out. You know, it'll be interesting.

00:17:15
What financial machinations he pulls between Tesla and xAI?

00:17:19
I think he already said that Tesla would soon start having Grok in

00:17:25
the cars, so I'm sure there's going to be all sorts of self

00:17:28
dealing between these companies. Well, yeah, that's that's kind

00:17:32
of his thing. Which the investors want.

00:17:34
It's like, how do we make sure they all make money like?

00:17:37
Basically it's like, it's like, yeah, yeah, if you could end up,

00:17:40
if you were an X, you know, holder and you ended up with

00:17:43
xAI, that's a little bit better. But yeah, if you can somehow

00:17:46
spin this off into Tesla stock or or even better, SpaceX stock,

00:17:50
that would be nice. Well, that's one of the core

00:17:52
reporting questions. I, I think we didn't really ever

00:17:55
answer was like did X shareholders or xAI shareholders

00:18:00
get a better deal in that merger.

00:18:02
And I think the answer was like on metrics at the time, you

00:18:06
would have said obviously X shareholders, but based on what

00:18:10
you just said and what a lot of people think, like xAI has this

00:18:12
unlimited valuation potential. So you're getting a good deal

00:18:16
getting access to xAI. That said, they don't really

00:18:20
make all that much money right now, right? It's all, I mean,

00:18:23
it's like, but yeah, with Elon, what is making money, you know,

00:18:25
it's like about it's, it's a valuation thing.

00:18:28
So while we're talking about unlimited valuation for xAI,

00:18:32
I guess you guys have been spending a lot of time thinking

00:18:34
about, and I was talking to our colleague Jonathan Weber about

00:18:37
this. Are we in a bubble?

00:18:40
Why now do you think we potentially are in a bubble?

00:18:42
Stocks have been going up for for some time.

00:18:45
Like what's what's particularly bubble like about what's

00:18:47
happening now? I think the biggest thing really

00:18:51
was just the Ramp report that Ramp's research team put out,

00:18:55
which is of course based on a smaller sample size of people,

00:18:58
you know, using Ramp and what their customers are spending

00:19:01
money on. But I think there was a

00:19:03
noticeable kind of peak hit in the last month around tool

00:19:08
adoption. And while it's not a bubble, I

00:19:10
would say on AI, it could be a bubble on how much people are

00:19:14
willing to pay for AI tools, especially at the enterprise

00:19:16
level. So while that may not hurt AI

00:19:19
overall in the long run, all of these companies that are

00:19:21
building the next big tool for enterprise AI may have been

00:19:25
charging a bit too much. And I mean, we saw that, you

00:19:27
know, with Cursor prices going way up and people complaining

00:19:30
and them getting subsequently reversed.

00:19:32
Like it could be just a point where not everyone is willing to

00:19:36
pay for infinite AI anymore, even if they think it'll get them ahead.

00:19:39
They may want to focus on cost at this point.

00:19:42
Let me take a stab at this. I mean, first of all, Newcomer

00:19:45
is not declaring it's a bubble. I think we're pointing out that

00:19:48
there are a couple, an impressionistic take that there

00:19:51
are some things that might raise bubble questions. 1, is this

00:19:55
survey from Ramp showing that sort of spending on new AI apps

00:20:00
is going down, right? The number of apps. I'm also

00:20:03
interested in sort of CoreWeave and the infrastructure companies

00:20:07
and the heavy valuations that those companies have

00:20:11
received. You know, they've seen a lot of valuation growth, and now

00:20:14
people are starting to ask questions about how they account

00:20:18
for things. So, yeah, I think there's the

00:20:21
sort of highly valued infrastructure players racing

00:20:24
into AI. There's this sort of Ramp data

00:20:28
turn. So the, I think there are

00:20:31
questions about things. I think the, the key, the key

00:20:33
point here though, is like you can have like sort of a

00:20:36
correction in one sector, but not necessarily in AI.

00:20:40
Like I, I think the mood could turn against infrastructure

00:20:43
plays. Like in the dot-com era, there was this huge telecom build-out

00:20:48
that was sort of correct, but they, they did it way too fast

00:20:52
and it wasn't that valuable. And so I think they're going to

00:20:55
be... definitely, nobody can contest that there will be bubbles

00:20:58
within AI where some pieces of the stack are not that valuable,

00:21:03
but need to exist. People spend a lot of money to

00:21:06
build them and then it turns out they don't get to retain a lot

00:21:09
of the actual profits from AI. So it could be foundation model

00:21:13
companies, it could be infrastructure companies.

00:21:16
But certainly I think there are pieces of this where a lot of

00:21:19
money will be spent and not a lot of money will be sort of

00:21:22
retained by that category. And, and we don't know how

00:21:25
that's going to play out, but it's it's been to the benefit of

00:21:27
everybody who wants to build an AI.

00:21:29
I mean, say, we've also kind of seen this with sort of the

00:21:31
initial chat bot wave, right? Like there was a period of time

00:21:34
where there were all these start-ups coming out with chat

00:21:36
bots, which ultimately kind of just absorbed into a few players

00:21:39
and that wasn't actually monetizable.

00:21:41
And that wasn't the end of chat bots.

00:21:43
That just meant that that frame of how to make money off of AI

00:21:47
with chat bots wasn't going to be the way that everything

00:21:50
became as an individual company or application and had to be

00:21:53
blended into something like an OpenAI, which could provide all

00:21:55
these other services. So I think there certainly will

00:21:58
be winners. It just may not be so stratified

00:22:00
across all these different products.

00:22:02
It's also, if you buy the argument that a lot of the bulls

00:22:04
have on AI that this is like a multi year, you know,

00:22:08
transformational cycle, we're gonna have dips in it, right?

00:22:12
It's just, it's just the nature of it.

00:22:13
Very bullish on AI while saying there seems like maybe a

00:22:16
bubble right now. Yeah, I'm not even sure where I

00:22:19
fall in it necessarily, but I just don't think we're going to

00:22:21
see, like, the peak bubble now. Like I think there's going to

00:22:24
be, it could be a couple years down the line where we see like,

00:22:27
oh, in fact, this did not turn out to be as lucrative as a lot

00:22:30
of the people thought it was going to be.

00:22:32
And the investments were well out over their skis.

00:22:34
I'm just not ready to call it now.

00:22:36
You know, I just haven't seen enough indication.

00:22:37
A catalyst, besides us over-extrapolating from one Ramp

00:22:42
survey otherwise, I mean, yeah. Yeah.

00:22:45
Well, there were things that were, well, there were things

00:22:47
that I think caused people to take notice, like, you know,

00:22:50
Microsoft pulling out and kind of rejiggering some of its

00:22:53
leases on data centers, which people for a time were deciding

00:22:56
that meant that they no longer wanted to build data centers,

00:22:58
which just isn't true, and may have been very specific to Microsoft

00:23:02
and its open AI relationship. But people look at those sort of

00:23:05
things and private equity companies that own a lot of the

00:23:08
data center companies may be changing their leases.

00:23:10
So this has been going on for some time, but I just don't

00:23:13
think like, like you're saying, I just don't think we would have

00:23:15
seen it now. It felt very out of the blue.

00:23:17
Here's what's sort of confusing about this situation.

00:23:21
OK. If NVIDIA produces high quality

00:23:26
AI chips at a faster pace like they say they're going to, if

00:23:29
they're going to refresh these chips, will that be a catalyst for

00:23:35
a bubble bursting, or will that be good for AI, right?

00:23:38
In some ways, if they update their chips more often, the

00:23:41
infrastructure companies which have spent a lot of money on

00:23:44
chips may be miscalculating how valuable that investment is and

00:23:49
may need to spend a lot of money to move on to the next one.

00:23:51
So, but obviously improving technology will be good for what

00:23:57
consumers experience in terms of AI, but it could disrupt some of

00:24:03
the infrastructure providers. So these are really sort of

00:24:06
complicated questions in terms of which companies get hit and

00:24:09
which companies do really well. Yeah, totally.

00:24:11
And I think you could see AI being as transformative as a lot

00:24:14
of the bulls expected it to be. And NVIDIA takes a hit in a

00:24:18
different way because there's more competition or because

00:24:21
there doesn't need to be, you know, the next generation of

00:24:24
chips being as expensive as they are, you have real breakthroughs

00:24:27
in terms of efficiency that makes it a better product for

00:24:30
people, more affordable for businesses.

00:24:32
But someone like NVIDIA gets its valuation just kind of

00:24:35
decimated. Not predicting that, but I'm

00:24:37
saying like, yeah. Both can be true.

00:24:39
I mean, I'm, I'm very bullish still on what's to be built in

00:24:42
AI in a lot of these start-ups. What we're hearing from people

00:24:45
like to me, I think there's, yeah, it remains my point of

00:24:50
view that there's a lot of capability available from the

00:24:55
existing models that people have not figured out how to turn into

00:24:58
products and people are going to turn those into products and

00:25:01
those products will make money. What is not clear to me is if

00:25:04
the foundation models get to recoup all the money they

00:25:08
spent to build up that capability, and whether the

00:25:11
infrastructure companies are properly valued based on

00:25:14
everything they've contributed to creating all that value.

00:25:17
Right. And then, you know, on the

00:25:19
enterprise side of things, like I still think we're not at a

00:25:24
point where businesses are seeing a great amount of benefit

00:25:27
from this technology. To be fair, maybe they just

00:25:30
haven't figured it out yet. Like I did this panel a month or

00:25:33
so ago with like all these chief data officers at fairly, you

00:25:36
know, old line companies, you know, like a Ford Credit Union

00:25:40
or Danone, like the yogurt company, and they have the.

00:25:47
You're the yogurt business guy now. Well, I mean, I was invited to

00:25:50
do it. You know, I don't, I don't, I

00:25:52
didn't prior to this have a deep relationship with the yogurt

00:25:55
industry. But but, but like, you know,

00:25:58
they all have chief data officers.

00:25:59
They all want to figure this thing out.

00:26:00
But none of them, as I could tell, had really gotten like,

00:26:03
oh, we've implemented it in this way and now we've transformed

00:26:06
our business across the company to use this.

00:26:08
So they all want to do it. They don't really know how to.

00:26:11
And I think the agent buzzword, as exciting of an idea as it is,

00:26:17
I think we are far away from regular enterprises really

00:26:21
implementing agents as sort of employee equivalents that are

00:26:25
running around spending budget making decisions.

00:26:28
And so that conversation, to me is sort of a signal of hype

00:26:31
getting out of control. Yeah.

00:26:33
As a daughter of a lawyer who works in this

00:26:37
kind of space, I can just report back that a lot of these

00:26:39
companies are a little bit risk averse in letting agents run

00:26:43
rampant, even if they are in theory very excited about this.

00:26:46
So I do think it's going to be kind of a multi year adoption

00:26:49
curve on this one, but it could mean that the hype right now is

00:26:52
a little bit too much. So far as I can tell, the

00:26:55
biggest adopters of this stuff are like the makers of the

00:26:57
technology themselves who seem to be all about laying people

00:27:00
off in order to like take advantage of the efficiency.

00:27:03
Like Microsoft just laid off like another several thousand people

00:27:07
and it really seems like. Were they middle managers, or which?

00:27:09
Which tier were they? Do you know?

00:27:12
They had they had a middle manager layoff a couple months

00:27:14
ago, but this one was mostly in the sales org.

00:27:16
Do you know what percentage? Because I, I, one of my, you

00:27:18
know, the other version of the Newcomer podcast is the Cerebral

00:27:21
Valley podcast, which we've been recording with Max and

00:27:24
James. And in that we have these

00:27:26
predictions, and I think one of mine is that a Fortune

00:27:30
500 company will attribute layoffs to AI.

00:27:34
So I'm, I'm, unfortunately for workers, rooting I think for a

00:27:38
mass AI attributed layoff in terms of my prediction becoming

00:27:42
true. Just for, just for the Kalshi

00:27:44
bet. Just so I could say I was right,

00:27:47
you know? Yeah, Microsoft was not explicit

00:27:50
about it at all, but there were some interesting pieces that

00:27:52
came out afterwards that like in the meeting after they laid off

00:27:55
all their, you know, 50% of the sales team in certain

00:27:59
departments, you know, the head of the unit, the CVP was like,

00:28:02
by the way, you guys should really start using AI more.

00:28:05
It can make your life a lot better.

00:28:06
And so there, you know, we're not quite saying the quiet

00:28:09
part out loud, but it seems like we're at least with those

00:28:12
companies like leaning towards that they want to set an

00:28:14
example. Here's how you lay people off

00:28:16
when you when you use our products.

00:28:18
Just mood check before we move on to the next subject.

00:28:21
Are you bullish or bearish on AI over the next six months?

00:28:25
Six months. I, I think there's wait.

00:28:28
Oh, it's, it's, I mean, it's gonna continue to go

00:28:31
up. There's no indication that

00:28:32
things are going off a cliff. All right, same.

00:28:35
We're all bullish in six months, but maybe that means we're all

00:28:37
blind. We're all bought into the hype.

00:28:39
You know the the air comes out at some point.

00:28:42
You got to do better than the ramp data for me to just decide

00:28:44
to be bearish though. I agree.

00:28:46
That's why this is impressionistic, certain signals

00:28:48
to pay attention to. We're not saying this thing

00:28:51
is a bubble. All right, the last topic, an

00:28:54
exciting one. Spicy.

00:28:56
Complaining about tweets, Tom, you're all fired up about this.

00:29:00
Somehow. You are OK with MechaHitler,

00:29:02
but you can't stomach Sequoia's Shaun Maguire.

00:29:06
I can't, I can't deal with Sequoia's earnest,

00:29:08
his earnest political views. Shaun Maguire is a guy I don't

00:29:14
know personally. I have, you know, I've met many

00:29:16
Sequoia partners. I've gone to the Sequoia spring

00:29:19
mixers and fall mixers and there's quite a lot of people

00:29:21
there that I like. I don't think Shaun Maguire is

00:29:27
oh, I'm sorry, let me, let me before we get into that.

00:29:29
He has been going off on Twitter for about two years almost,

00:29:33
mostly with his hardline right-wing, pro-Israel politics.

00:29:37
He likes to make people mad. This is very exciting to him.

00:29:42
He really got into hot water, so to speak, over the 4th of July

00:29:46
weekend when he, after the story in the New York Times about

00:29:51
Zohran Mamdani maybe being misleading on his Columbia

00:29:55
application, Shaun goes out there and says, you know, Mamdani is a

00:29:59
liar because he's an Islamist. And this is culturally baked

00:30:02
into what Islamists do. And then people got mad because

00:30:06
it really sounded like he was saying all Muslims are liars,

00:30:09
culturally speaking. And then he goes out and says

00:30:11
that's not true. I wasn't talking about all

00:30:13
Muslims. I was, I was talking

00:30:14
about Islamists. And then that became a whole

00:30:16
conversation about what it means to be an Islamist and whether or not Shaun is

00:30:19
a racist. And I don't really know.

00:30:20
And Bill Ackman and people have rallied around

00:30:24
Shaun's defense. I saw Josh Wolfe also rallying to

00:30:27
his defense. Josh Wolfe, too.

00:30:30
There was a petition of people condemning Shaun.

00:30:32
There was a counter petition of people supporting Shaun with

00:30:35
signatures from all walks of this world.

00:30:38
We, we're not cancelling Shaun Maguire, that is, that is,

00:30:43
we're being very clear. And I think you you do a good

00:30:46
job of pointing this out, Tom. But like, yeah, the one thing

00:30:50
everybody in Silicon Valley can agree on is no more

00:30:52
cancellations. And we're we're we're sticking

00:30:55
to that Armistice. Like we're not trying.

00:30:57
Nobody on this podcast is trying to cancel Shaun Maguire.

00:31:02
Yeah, the idea of someone losing their job because of their

00:31:04
opinions in general, I'm, I'm generally against it.

00:31:07
I think that's not a great thing for societies to have happen.

00:31:09
I wouldn't advocate for it, I don't think, for Shaun.

00:31:14
So OK, before I get into my personal opinion, I listen to a

00:31:17
podcast interview Shaun Maguire did with Jack Altman.

00:31:19
Second time he's come up on this episode, in which he talked

00:31:21
about his decision to kind of come out as an opinion haver and

00:31:26
talk about all of his right wing politics.

00:31:28
And he was saying after October 7th, you know, and he, Shaun, was

00:31:32
pulling from his history. I think he worked for the DoD.

00:31:34
He understands information warfare and the way that

00:31:37
misinformation can be spread online.

00:31:39
It was extremely important for him to start talking about his

00:31:43
politics and he even said on this podcast on that, on that

00:31:46
episode, I'd be willing to die for, you know, for the things

00:31:49
that I believe and I don't advocate for that at all.

00:31:52
That would be awful. I don't really think the things

00:31:54
that Sean is saying are necessary to say.

00:31:58
I don't really think. And he has a pretty large

00:32:00
platform. He has 250,000 followers.

00:32:03
He really thinks that he is putting important stuff into the

00:32:05
discourse based on that podcast and the fact that he keeps going

00:32:08
on about this stuff. And this is very common now

00:32:11
within Silicon Valley. Oh, it's very common.

00:32:13
There are certain VCs that have decided that their platform and

00:32:16
their takes are so important that they want to build a whole

00:32:18
media empire around it. And we've got, you know, the

00:32:21
best version of that is the All-In guys who have built an

00:32:23
actually successful show with their takes on things.

00:32:26
Mike Solana, who I also don't know, he's not really an

00:32:29
investor, but he's in, you know, he's obviously in the VC world.

00:32:32
He's got. CMO.

00:32:34
Yeah, yeah. He's not a VC, but but you know,

00:32:36
whatever he's he's in that world.

00:32:37
He built Pirate Wires, which is like

00:32:40
publication. And I and I totally give him

00:32:42
credit for all of that. You're basically saying, Shaun,

00:32:45
if you're gonna make us listen to you constantly, you

00:32:48
need to be a little more creative.

00:32:49
It can't just be like have. Something interesting to say.

00:32:52
Like yes, have something. More interesting to say than

00:32:55
just like what I. Find racially tinged things that

00:32:58
get everybody into a fury. It's sort of like the lowest

00:33:00
common denominator way to get people.

00:33:02
To. Right, it's a hack. It's a hack to get people

00:33:05
to think that you're interesting because you're saying stuff

00:33:07
that's kind of pushing the envelope in a racial way and,

00:33:10
or, or whatever way, in a culturally bigoted way.

00:33:12
And I just think if it's that important to you, Shaun, that you

00:33:15
really want to like be flooding those of us who want to maybe

00:33:19
see your takes on other things, you know, within the the tech

00:33:22
world, talk about your politics, start up a Substack, like let's

00:33:24
just see how interesting your views really are.

00:33:27
Like build up a little media empire around you and see if

00:33:29
people actually care. Because he believes in all sorts

00:33:31
of conspiracies too. He's like.

00:33:34
Yeah, he does have his election denier stuff.

00:33:37
I mean, I guess I could just be furthering your take, Tom, with

00:33:41
this. First of all, clearly it

00:33:43
hasn't affected Sequoia's brand standing at all.

00:33:47
So anyone who's concerned about cancellation, the attendant, you

00:33:50
don't think so? I.

00:33:51
Think it hurts the brand a little bit, or like it looms so

00:33:54
large in terms of what Sequoia's brand is like.

00:33:56
I think of Sequoia again, I don't wanna go after Shaun or

00:34:01
have them tell him to stop, but I do think just from a brand

00:34:04
perspective, it's one of the loudest parts of their brand

00:34:07
right now. But I think they're still really

00:34:09
regarded as, yeah, I think they're still, I think the

00:34:12
results overwhelmingly supersede this.

00:34:16
Like I really don't think it's hurt at all.

00:34:18
And I think if you're kind of in the ecosystem, especially where

00:34:21
maybe you hold some of these views too, or you're like a

00:34:24
little bit enjoying the kind of vocal troll-y posture, which I

00:34:28
will say is kind of the case for a lot of people in certain sectors

00:34:32
and different industries. Like it could be a value add for

00:34:35
you. You could be like signaling

00:34:36
like, hey, we don't care. People think we just want the

00:34:38
returns. And so I don't think it hurts

00:34:40
them really at all. And so, yeah, let me.

00:34:44
Make a couple quick points. I mean, one, you know, being close

00:34:48
to Elon and having somebody in this sort of

00:34:51
conservative troll world gets you a certain contingent of

00:34:54
people. And so if Sequoia can get away

00:34:56
with, OK, we're the normie guys, but we also are friendly with

00:35:01
Elon world, you know, that's, that's there's a financial play

00:35:04
there. And I think that's part of

00:35:06
what's going on. The other thing I just want to

00:35:08
say on this conversation is that, like, kudos to Amjad, the

00:35:11
CEO of Replit, I'll put it that way.

00:35:13
I mean, he's been very bold in, you know, voicing unpopular

00:35:20
opinions. And I think Shaun represents

00:35:22
himself as like articulating this sort of like, I don't know,

00:35:25
out of consensus point of view. Well, but Amjad is an

00:35:30
interesting case here because, you know, he went on Joe Rogan a

00:35:33
couple of weeks ago, talked about all, all kinds of things.

00:35:36
But, you know, within that he talked about, you know, his

00:35:38
being of Palestinian descent and he's Jordanian.

00:35:42
And he was like, yeah, it's not easy being, you know, one of the

00:35:45
few people in Silicon Valley willing to speak up, you know,

00:35:47
speak out against Israel and, and for the Palestinians.

00:35:51
Yeah, right. Paul Graham is an interesting

00:35:52
case and all of that, but it feels more personal with Amjad.

00:35:57
And he's gotten into it a little bit with Shaun on X.

00:36:01
And Shaun completely took out of context something that Amjad had

00:36:04
said and cut off the date of like a Washington Post article

00:36:07
so that he could claim that what he had said on Rogan was

00:36:09
inaccurate. And you know, that's standard

00:36:11
trollish behavior on, on Shaun's part, like he was trying to get a

00:36:14
rise out of people. I don't even think Amjad

00:36:16
is that, honestly. Like, you know, the politics that he was

00:36:18
describing on Rogan doesn't even really conform with like the

00:36:21
most leftist stuff. He was talking about two state

00:36:23
solutions. He was, you know, like I I care

00:36:27
a lot about what happens to the Israeli people.

00:36:29
I mean, that kind of stuff would not actually, you know,

00:36:33
endear you too much to the left. Sure.

00:36:35
But the stuff you're hearing on tech Twitter right now is like,

00:36:39
just like, let's send every weapon we have over to Israel

00:36:42
and like, let the chips fall where

00:36:44
they may. Sure, Joe Lonsdale is probably

00:36:46
the extreme version of that. And and like, I don't know, I

00:36:48
would be interested in your take, Eric, but like Sequoia,

00:36:51
you know, with Doug Leone as the head of the firm also like

00:36:54
fairly well known conservative like as the firm, they certainly

00:36:58
if you want to look at the personality.

00:37:00
They had Doug Leone and they had Mike Moritz, both really strong

00:37:03
investors, opposite sides of the political spectrum.

00:37:07
Clearly Doug Leone won. I mean, Moritz isn't even part

00:37:10
of the firm anymore. And it seems like that ideology

00:37:14
won. I mean, I, I, Roelof doesn't talk

00:37:16
about his politics a ton, but my sense is he's on the more

00:37:19
conservative side of things. And so yeah, I don't, I don't

00:37:23
see who the beacon of liberal or left-wing thought is, at least

00:37:28
publicly as Sequoia is right now.

00:37:30
And do you think Doug being there gives Shaun a lot more

00:37:33
cover to be able to to say the stuff that he does?

00:37:36
I mean, my first piece for the newsletter was about, you know,

00:37:39
him being a Trump-backing conservative.

00:37:41
It was about China and about how the center

00:37:46
could not hold with Sequoia having China while having anti

00:37:50
China politics, sort of. Anyway, I was right that China spun out.

00:37:53
But anyway, that was my first piece and it, it talked about

00:37:57
how Doug Leone was a big Trumper.

00:38:00
And so, yeah, I, I mean, Doug's not in charge anymore, but he's

00:38:03
still around the firm. And I think that the fact that

00:38:07
if, if Shaun was so out of step with the rest of Sequoia, it

00:38:11
wouldn't last. Yeah, but he also kind of wants

00:38:14
to be maybe fired a little bit, a little bit like, I mean, look,

00:38:18
he I don't know, I, I, I mean, on that podcast he talked about

00:38:22
the fact that, you know, he's independently wealthy now

00:38:24
because he had sold the company earlier.

00:38:26
So he has a lot of money. So he's able to, you know, take

00:38:30
these risks, these risks with his, with his opinions.

00:38:33
And then, you know, he changed his Twitter bio or his X bio

00:38:36
for a bit to say like Sequoia partner in parentheses for now.

00:38:39
Like he's he's certainly baiting people to to and, you know, he

00:38:43
talked about how the petition, you know, was

00:38:47
another example of cancel culture.

00:38:49
And these people love to be, you know, victims before the fact.

00:38:54
So yeah, I don't know if like I don't, I don't even talk to

00:38:57
Shaun. I don't know.

00:38:58
Why? It would, it would play into his

00:39:00
martyr narrative. This Islamist versus Muslim

00:39:03
distinction, like why? Why, why, why say something

00:39:08
that's so easily misinterpreted? I guess is my, my issue here.

00:39:12
Because he's a troll, right? And I hate that kind of

00:39:15
communication. I just like I, at my core, want

00:39:18
to speak sort of earnestly and openly and be heard as

00:39:24
what I'm intending to say, you know?

00:39:26
And I feel like he's trying to hack engagement.

00:39:29
And that is what I object to. Right, right.

00:39:33
No. And that was sort of the point

00:39:34
of my column is, is I don't even really deal with the specifics

00:39:37
of what he's saying and interrogating the logic.

00:39:39
I mean, that's a whole other thing.

00:39:40
But like, he is kind of claiming that he is on this holy war to

00:39:45
put his opinions out there because it's extremely

00:39:47
necessary. And in fact, I don't think these

00:39:49
opinions are that surprising. I don't think they're all that

00:39:51
necessary. And he's hacking the system

00:39:53
anyway to get them out there. Like he's not really saying.

00:39:56
And he's friends with the guy who runs the platform, the guy

00:39:58
who's shifting his own AI, to bring this full circle, to sound

00:40:02
more like him, you know? It's like, oh, I could be one of

00:40:04
the voices that says the things that Elon wants me to say, you

00:40:08
know? Yeah, there, there are certain

00:40:09
people there, there are certain people who I think have been

00:40:12
elevated by the Elon, you know, Elon era Twitter and the For You page

00:40:17
and the algorithm, and it's like Shaun Maguire is one of them.

00:40:20
So also is the Die, Workwear guy, but, but you know, different

00:40:24
different sides of the. I mean, one of the most iconic

00:40:26
things that has been said on this podcast is Ben Smith saying

00:40:30
we're all in the courts of billionaires.

00:40:32
And I think that theme has continued to play out.

00:40:35
And yeah, Shaun Maguire's in the court of Elon Musk.

00:40:39
Right. And, and, and benefited from it

00:40:41
in lots of ways financially and also like in this little mini

00:40:45
unimpressive media thing that he's put together.

00:40:48
But we are not cancelling you, Sean.

00:40:50
Keep tweeting. Godspeed.

00:40:51
Just make them less hate, more, more thought.

00:40:55
Well, he's not hateful only towards Islamofascists or

00:40:58
Islamists, those you can hate, which.

00:41:00
Which it is offensive. It is offensive to say that

00:41:04
Mamdani is an Islamist. Like there, there's nothing

00:41:07
there. It's like he's a normal mayoral

00:41:10
candidate. Like I feel like we can't let

00:41:12
that part of it slip by. Like it's just like.

00:41:14
That point like it is just blatantly false.

00:41:16
Also, it's very it's, it's just, I don't want to let that slide.

00:41:20
Have you done the research though, Madeleine?

00:41:22
Have you done the research? I'm familiar with who his father

00:41:26
is, he's a professor at Columbia. I'm, I will just say I don't

00:41:29
think that Zohran Mamdani is an Islamist and there's no evidence

00:41:34
to support that, and to say that is inherently

00:41:37
inflammatory. Correct me if I'm wrong, by the

00:41:39
way. As the non-New Yorker, as

00:41:42
the, as the non-New Yorker on this pod, as the non-New Yorker

00:41:45
on this podcast, he didn't even really run, like he didn't run

00:41:48
like an identitarian campaign. Right now that's all the right

00:41:53
wing effort now is to basically say you were misled like this

00:41:56
happy go lucky campaign that you saw.

00:41:59
That's not who he really is. It's all bullshit.

00:42:02
I'm sorry he ran a campaign on affordability.

00:42:04
But anyway. I feel like the substance of it

00:42:07
matters, you know, not just the optics.

00:42:08
Anyway, that's our episode. We'll see you next week.

00:42:12
Thanks for listening. See you next week with Shaun

00:42:13
Maguire co-hosting, I'm. Sure.

00:42:16
There, you put that out into the world.

00:42:18
There's some percentage. Is Grok still too woke next

00:42:21
week? All right, see ya.

00:42:24
Bye.