What do Sam Altman, Jensen Huang, Reid Hoffman, Marc Andreessen, and Elon Musk actually believe about the future of tech?
In this episode of the Newcomer Podcast, we break down the quotes that defined tech in 2025. From OpenAI and Anthropic to venture capital, regulation, and Silicon Valley power, these are the moments where powerful people said the quiet part out loud.
Rather than reacting to headlines, we look at the specific lines that revealed how AI companies think about compute and money, how venture capital is consolidating power, and why tech and politics are now inseparable.
We cover:
What Sam Altman and OpenAI revealed about scale and compute
How VC giants like Andreessen Horowitz and Lightspeed talk about power and access
Why AI regulation looks very different in public than it does in private
The quotes that mattered more than any keynote or earnings call
This is a year-in-review told through the words that shaped it.
00:00:00
What do Sam Altman and the Pope have in common?
00:00:02
They're both on Newcomer's list of the top quotes from 2025.
00:00:06
We've got 18 quotations from business leaders, tech titans,
00:00:11
politicians and industry insiders.
00:00:13
Everything from Jensen Huang talking about America winning
00:00:17
the AI race to Elon Musk signaling the demise of his
00:00:20
relationship with Donald Trump, and even Sam Altman talking
00:00:23
about his GPT parenting skills with Jimmy Fallon. Senators, the
00:00:27
vice president, and even the Pope will all make an
00:00:31
appearance on our list of the most important and impactful
00:00:34
quotes of 2025. This is the Newcomer Podcast.
00:00:45
Welcome to the Newcomer Podcast. One of our favorite things here
00:00:49
is to look back at the year through quotes.
00:00:53
Some spicy, some profound, some inane, some we think are
00:01:00
embarrassing. We are looking for a mix of
00:01:03
quotes from high profile speakers.
00:01:06
We've got, you know, everybody from Sam Altman to the Pope in
00:01:10
this list. Madeline Renbarger, yours truly
00:01:14
on the podcast, is the main compiler here.
00:01:17
So she'll help explain which ones we picked.
00:01:22
I'll have to defend myself on the pod
00:01:24
for why I think this is the most important quote of the year.
00:01:28
Exactly. And we weighed in, and then
00:01:30
you'll have Tom Dotan's melodious voice reading the
00:01:34
quotes to you. I like to give Tom all the
00:01:37
reading assignments. He's our audiobook kind
00:01:40
of guy. Sure, yeah. In a different life, though, I
00:01:42
think you could have been an audio book reader.
00:01:46
So yeah, we'll get into it.
00:01:47
We have 18 quotes here. I think you're gonna get a great
00:01:50
snapshot of the wild 2025 and maybe that'll give us a little
00:01:56
bit of a lens into what to look for this year.
00:02:00
It will be no surprise that artificial intelligence looms
00:02:02
large. This podcast, obviously we
00:02:05
talked about AI a lot, so not surprising we'd be interested in
00:02:07
those quotes. But I think, you know, we're
00:02:09
we're quoting the Pope here. We're quoting, you know, the
00:02:12
vice president. So yeah, without further ado,
00:02:14
let's get into the 18 quotes that defined 2025 from CEOs,
00:02:19
VCs, political leaders, and everybody else.
00:02:23
I'll do my best to read these in character too, OK?
00:02:26
That wasn't even the mandate. Well, we'll see if that
00:02:28
works, even. Anyway, OK. We stand now at the
00:02:31
frontier of an AI industry that is hungry for reliable power and
00:02:35
high quality semiconductors. Yet too many of our friends are
00:02:38
deindustrializing on the one hand and chasing reliable power
00:02:42
out of their nations and off their grids with the other.
00:02:45
The AI future is not going to be won by hand wringing about
00:02:48
safety. It'll be won by building from
00:02:51
reliable power plants to the manufacturing facilities that
00:02:53
can produce the chips of the future.
00:02:55
That is JD Vance on the need for less AI regulation at the Paris
00:02:59
AI Action Summit, February 11th, 2025.
00:03:03
Madeline, why did this stand out to you?
00:03:05
Well, this quote I felt like was really kicking off the year with
00:03:09
the US on the global stage and JD Vance of all people telling
00:03:13
Europe that they need to relax their AI regulations and get out
00:03:16
of the way for American innovation.
00:03:18
And I felt like both politically on lots of fronts, but the tech
00:03:22
industry at large in our era of effective accelerationism, you
00:03:25
had, you know, the commander in chief's, you know, second in
00:03:29
command there basically saying, yeah, let's go.
00:03:34
It's time to build, Marc Andreessen, you know, in front
00:03:37
of everyone. No safety net. No safety, and the safety
00:03:40
regulations are only hindering, you know, our future.
00:03:43
And in many ways also it was not a very global friendly speech.
00:03:47
Other parts of the speech, you know, kind of shit on, you know?
00:03:52
Right, kicking off American antagonism towards Europe.
00:03:55
Basically, we're like, yeah, but we're going to do this, and you
00:03:56
better get out of the way. It was very, you know,
00:03:59
isolationist as well. I mean, to me, this is just JD
00:04:02
Vance and the administration doing full business capture of
00:04:06
the tech industry. This is what they paid for.
00:04:08
This is why they wanted him in office.
00:04:10
They wanted no regulation. They wanted tech to be in
00:04:12
control of all policy writing. They have the quasi VC as vice
00:04:17
president going up there and telling Europeans that they
00:04:20
should not regulate any of our Great American companies.
00:04:23
It's not going out there talking about Mistral.
00:04:25
It's not telling these people, don't be regulating
00:04:27
Mistral and Fluidstack and, like, Lovable.
00:04:30
Like they want to make sure that OpenAI, which is, you know, Sam
00:04:34
lobbied very hard to get, you know, positioning in the
00:04:36
administration, to leave us alone.
00:04:39
So, hand-wringing about safety? Ew, gross.
00:04:42
We don't want... I mean, this is in February. You know, a
00:04:44
couple things happened. Recently we got the AI
00:04:48
executive order trying to ban states from regulating AI.
00:04:51
Who knows how effective that will be.
00:04:54
So it's not just trying to block Europe, it's trying to tell
00:04:57
states no regulation. There was an interesting piece,
00:05:00
I, I believe it was in Politico, about how some of the tech
00:05:04
lobbyists are worried that David Sacks and crew are overplaying
00:05:10
their hand on AI friendliness. That basically by saying no
00:05:14
regulation, there's going to be such a huge backlash that this,
00:05:18
this strategy of saying, you know, no regulation for AI is
00:05:22
only going to create more enemies.
00:05:25
And I think that that'll emerge as we keep going through these
00:05:27
quotes. I mean, Tom, you had a good
00:05:29
story earlier this year about the AI industry being really
00:05:32
upset about a Texas law about energy getting cut off to
00:05:36
AI data centers first in the state.
00:05:38
So it's not a given that the steamroll no regulation AI
00:05:42
policy will even fly with Republican senators too.
00:05:46
Yeah. And it's notable that this quote
00:05:47
is from February. I think a lot has changed in the
00:05:50
way people view AI, specifically its political valence.
00:05:53
And at the end of the year, we had people like, I think Marco
00:05:57
Rubio and probably Josh Hawley talking about concerns over AI
00:06:02
and the effect it was going to have on jobs and power.
00:06:04
And, you know, like inflation becoming a major issue and
00:06:08
whether or not it's related to like the cost of power and
00:06:10
whether AI is driving that up became really problematic.
00:06:13
So things will be interesting. Obviously, low
00:06:16
regulation, that's probably going to remain consistent.
00:06:18
But the idea that, like, AI is, you know, aligned with the
00:06:22
Republican Party is not really the case.
00:06:26
Yeah. All right.
00:06:27
Let's let's move on to our second quote of 2025, Tom.
00:06:31
All right. For some of these companies, for
00:06:34
a lot of our customers, it will be existential kind of life and
00:06:37
death decisions. And that is Ryan Petersen, a
00:06:41
podcast guest at one point, right, Eric?
00:06:44
Yep, Flexport's CEO. He's talking about how President
00:06:47
Trump's tariffs will impact large retailers that rely on his
00:06:51
logistics software at Strictly VC in San Francisco on April
00:06:55
7th. Ryan Petersen is like Silicon
00:06:58
Valley's tariff and trade guru, running one of the most real
00:07:03
world startups in Silicon Valley. I was going to say, I
00:07:05
picked this one in part because he's, you know, the logistics
00:07:09
infrastructure guy in Silicon Valley.
00:07:11
He's been incredibly successful with growing Flexport, and it's
00:07:14
a startup that is really dependent on government trade
00:07:18
policy. It's a logistics software
00:07:21
company that handles these big deals. So he became, you know, Silicon
00:07:25
Valley's point person, like you said, Eric, during the tariff
00:07:28
fiasco. Like how does this affect us?
00:07:30
How does this affect us? He went on, you know, lots of
00:07:32
shows explaining and live-tweeting as the policies rolled
00:07:36
out. And what's interesting is that
00:07:38
was tariff moment #1. The tariffs have changed quite a bit
00:07:41
throughout the year too. That was a few days after
00:07:44
Liberation Day, right? Yes, fresh.
00:07:46
Liberation Day. We were liberated then.
00:07:48
I will say, I mean right now the news of today at least is GDP is
00:07:52
doing well. I think the New York Times
00:07:55
headline was GDP is doing well despite tariffs.
00:07:58
And I don't know, it was Sacks or one of the Trump-aligned people
00:08:01
was sort of snarking the New York Times for putting tariffs
00:08:04
still in the headline when it's like OK, GDP seems to be OK.
00:08:08
So I don't know, has the existential crisis manifested?
00:08:14
Well, I mean, this sort of is getting away from AI and tech
00:08:18
specifically, but like, I think there's an incredibly pervasive
00:08:22
mood among the American public that the economy isn't doing
00:08:26
well right now. And whether that's psychological
00:08:28
or real is sort of unclear. And I don't want to get in like
00:08:31
the Will Stancil fight of, like, people don't really know what it
00:08:34
means for prices to go up. But Trump has bad approval
00:08:38
ratings right now. One of his lowest.
00:08:39
God, I feel like Jason Calacanis. Like, one of his lowest-ranking
00:08:43
categories within this is the economy.
00:08:45
People are not happy about basically rising prices.
00:08:48
And one of the things that they point to were the tariffs.
00:08:52
And so was it existential for some of these companies?
00:08:54
I don't know. You've seen retailers raising
00:08:57
prices, blaming the tariffs for their reasons for it.
00:09:01
So like, I think it's contributed. Pure life and death?
00:09:06
Yeah. If they continued at the insane
00:09:08
Liberation Day levels, it probably would have been
00:09:11
incredibly difficult for retailers to make things
00:09:14
work. Right.
00:09:15
There have been several backroom deals that have lowered them on
00:09:18
specific countries and concessions that other countries
00:09:21
have made where it's not kind of as blanket as it was on
00:09:24
Liberation Day for a lot of the US biggest trading partners.
00:09:27
So maybe we do it less, but you're totally right that
00:09:30
consumer sentiment is down, down, down on the economy.
00:09:34
Even if people are still spending, they're very unhappy
00:09:37
about the prices they're paying. 2025 was a year
00:09:41
living under the cloud of tariffs, which I don't think any
00:09:44
serious person thinks helped us. And the president should thank
00:09:48
God that all this AI infrastructure spending has just
00:09:52
been a gift to the economy. You know, just like build, build,
00:09:56
build for a still somewhat uncertain technology.
00:10:00
And yeah, the Trump administration got lucky on that
00:10:03
one as they were creating their own self-inflicted wound with
00:10:06
tariffs. Let's go to #3.
00:10:10
All right. Today, the Church offers its
00:10:11
trove of social teaching to respond to another industrial
00:10:14
revolution and to innovations in the field of artificial
00:10:17
intelligence that pose challenges to human dignity,
00:10:20
justice and labor. That's Pope Leo, the woke Pope
00:10:24
in a speech to Cardinals two days after being sworn in.
00:10:27
Yeah, I picked this one because, well, one thing I found
00:10:30
interesting about the woke Pope is that he picked the name Leo,
00:10:33
and he said this in relation to the first Pope.
00:10:37
Leo came about in, you know, the early hundreds AD amid
00:10:42
technological upheaval. And he's really focused on
00:10:45
artificial intelligence as an issue that will have profound
00:10:49
social and religious impact on the world.
00:10:52
And he's paying very close attention to it.
00:10:54
And it doesn't surprise me, frankly, that the first American
00:10:57
Pope ever is, you know, tuned into American innovation and
00:11:01
what's going on in our business sphere over here.
00:11:03
But I picked it because he's expressing that he's gonna be
00:11:08
focused on this throughout his papacy as a very big issue.
00:11:13
And then, what's the other... do
00:11:16
we consider the other quote? Wasn't there a Pope
00:11:19
quote later in the year? I mean, this is a good one where
00:11:22
the Pope said something equally benign about AI and like Marc
00:11:27
Andreessen shit on it. And then Marc Andreessen had
00:11:31
a Twitter beef with the Pope, right?
00:11:34
Because I think we should put that one in this one.
00:11:35
I sort of forgot about that one. Can you pull that one up too?
00:11:39
Yeah, I can pull that one up. I picked this one because this
00:11:41
was his first speech, so this was... we're talking about
00:11:45
AI in my. Yeah.
00:11:48
This Pope has said relatively mild things about, you know,
00:11:50
being thoughtful about AI. He picked his name after
00:11:53
somebody who navigated, not necessarily resisted, another
00:11:57
technological revolution. And some of the tech community
00:12:00
can't even seem to stomach like, hey, we should.
00:12:03
We should be moral about this thing, which seems like the
00:12:06
Pope's job to remind us. I always have mixed feelings
00:12:10
about conflating morality with the technology because with AI,
00:12:15
it makes me worry that, like, people are talking about the
00:12:18
morality of the AI itself, like, it is imbued with some sort of
00:12:22
intelligence and emotional capabilities when really it's
00:12:25
just the humans that are in control of it.
00:12:27
And you know, how readily you want to implement this
00:12:30
technology basically to displace people.
00:12:33
And which is why labor is an important thing for him to bring
00:12:36
up. You know, I don't want to, like,
00:12:37
take Marc Andreessen's side. It was such an obnoxious tweet.
00:12:42
Well, the tweet wasn't... OK, I finally did pull up the
00:12:44
tweet. We can't even quote it because
00:12:45
it was a GIF reaction, but it was...
00:12:49
It was the woman, the interviewer from the
00:12:51
Sydney Sweeney interview about her great jeans when she asked
00:12:54
her about the American Eagle ad?
00:12:55
So, for reference: the Pope is tweeting about AI.
00:12:59
Marc Andreessen replies with a reaction image from the Sydney
00:13:02
Sweeney interview controversy. It was like, come on.
00:13:06
Yeah, it all sort of sucked.
00:13:07
And I know we'll get as this progresses like more and more
00:13:10
into like the tech MAGA, right, and how they fared throughout
00:13:13
the course of the year. So I don't want to
00:13:15
blow it all here. But, you know, good on the Pope
00:13:18
for talking about labor, because that really is eventually going
00:13:22
to be one of the central animating forces of what AI
00:13:25
means for the economy. And to put that front and center
00:13:28
and say we need to talk about that as much as possible seems
00:13:31
like a good idea. It's important.
00:13:33
My issue on labor is always: is it taking the benefits of AI and
00:13:37
restructuring our safety net and figuring out new jobs? Or
00:13:40
whenever you hear labor and AI you think like, well, you know,
00:13:44
Hochul in New York just vetoed this legislation that was meant
00:13:49
to require 2 MTA workers instead of just one, right?
00:13:53
Like I don't think the answer should be creating make work
00:13:57
jobs or handicapping AI. Right, and I think
00:14:01
Sam has talked about that too.
00:14:05
Yeah, saying like, you know, some of these jobs that are
00:14:09
gonna be replaced by AIs weren't even really jobs in the first
00:14:12
place and we should kind of question their validity.
00:14:14
And he got a lot of shit for that.
00:14:16
I actually think that's true to an extent.
00:14:19
I think there are a lot of, like, make-work... OK, sure, we can
00:14:22
call them make-work jobs, but it's basically things that
00:14:27
were created to give people something to do. That is make-work.
00:14:29
Like, you know, email jobs. Basically
00:14:31
bullshit jobs. I think there's a whole book
00:14:33
about it. #4, our fourth quote of 2025.
00:14:38
I think it didn't exist. I think it was a fiction to some
00:14:40
extent. That was Jackie Reses at the
00:14:43
Breaking the Bank summit hosted by Newcomer.
00:14:45
And this was on the narrative that the Biden administration
00:14:48
was de banking institutions affiliated with
00:14:50
cryptocurrencies. That was from May.
00:14:54
Yeah. So Jackie Reses, CEO of Lead
00:14:56
Bank, she was basically refuting kind of a narrative that had
00:15:02
come up in this sort of crypto wave of the election that the
00:15:06
Biden administration had been actively, you know, shutting
00:15:09
down these companies that wanted to do crypto.
00:15:12
And this narrative that like, oh, we really wanted to do this,
00:15:15
but the regulators were way too strict.
00:15:17
And I do think there was some credence to that.
00:15:18
The Biden administration was pretty harsh on crypto
00:15:20
companies. But, you know, five months after
00:15:22
Trump is sworn in, you have, you know, the leader of this
00:15:25
financial institution who's very involved with these companies
00:15:28
basically saying, yeah, that didn't really happen.
00:15:30
That wasn't true. I don't know if I believe her
00:15:33
and obviously I am politically inclined to believe her and say,
00:15:37
oh, the Democrats problems were overstated.
00:15:40
But I don't know. I, I think the Biden
00:15:43
administration was clearly pretty hostile to
00:15:45
cryptocurrencies. And some of these companies had
00:15:48
trouble connecting with the financial system partially
00:15:52
because they were probably doing things that might not
00:15:56
have been legal. But I I think the Biden
00:15:58
administration would have been much better served to say more
00:16:01
stuff was bad and illegal than to let them operate in this
00:16:07
terrifying gray area, which created all sorts of problems.
00:16:10
So, I don't know, subjective call.
00:16:13
This is why I picked this quote though, because, like, not
00:16:15
everyone will agree with her statement.
00:16:17
Exactly. You know Eric has a dissenting
00:16:19
take. All right, quote #5 of the year.
00:16:22
I was disappointed to see the massive spending bill, frankly,
00:16:25
which increases the budget deficit, not just decreases it
00:16:28
and undermines the work that the DOGE team is doing.
00:16:31
That's Elon on the Republicans' Big Beautiful Bill,
00:16:36
which kind of sets up the big breakup between him and Trump,
00:16:40
and that's in May. That exactly.
00:16:43
That was the line that set off the fight, and it's hard to
00:16:47
quote massive Twitter and Truth Social subtweeting back and
00:16:51
forth in a newsletter in the same way.
00:16:54
So I figured this could be a good signpost for the moment.
00:16:57
But I would say as a moment in tech, obviously without a doubt
00:17:00
one of the biggest stories of the year and most significant
00:17:04
things happening was the Trump-Elon DOGE fallout. And RIP, DOGE.
00:17:09
Yeah, and now the consensus is basically DOGE did nothing of
00:17:13
value, certainly didn't bring down...
00:17:15
They barely cut the deficit at all.
00:17:18
They really didn't even... it was minuscule compared to how
00:17:22
much the federal government spends.
00:17:23
Obviously this bill is part of that because it increased
00:17:25
spending, but I think it's not controversial at all to say DOGE
00:17:28
was a failure. I want to take a little bit of
00:17:31
credit on this show. This tweet from Elon comes in
00:17:35
late May and I know on this show this was in the wake of the
00:17:40
Wisconsin election that Elon had spent millions of dollars on
00:17:43
that blew up in the Republicans faces.
00:17:45
And I think I said on the show, I think this is the beginning of
00:17:48
the end of Elon's political influence.
00:17:50
And really, I think that, you know, the tables
00:17:53
turning against Trump, and, like, the honeymoon period of the beginning
00:17:56
of the administration falling. And that absolutely
00:17:59
turned out to be the case within a couple of weeks or days of the
00:18:03
Wisconsin election. You have the DOGE fallout.
00:18:06
You have the entire rupture between Elon, I'm sorry, like
00:18:09
the DOGE starting to collapse, the rupture between Elon and
00:18:10
Trump, Trump's approval ratings start to fall.
00:18:13
And you really start seeing that tech's whole bet on the Trump
00:18:15
administration has looked in some respects really, I don't
00:18:20
know, just not panning out for any length of time.
00:18:24
And so we're going to talk
00:18:27
quotes, but like, this is another one that didn't work
00:18:29
out. And Elon eventually tweets, like, Trump's in the Epstein
00:18:32
files, which is obviously very true and sort of consuming today.
00:18:40
I mean you know we're we're sort of swinging back around where
00:18:43
Elon is preparing to fund the Republicans in the midterms.
00:18:47
It seems like Trump and Elon are, you know, healing some of
00:18:51
their disagreements. And so, yeah, in a fairly short
00:18:57
amount of time, we saw Elon being sort of, I don't know what
00:19:00
the favorite son, sort of on Trump's arm at every opportunity,
00:19:04
doing DOGE, with access to sort of every file, then falling out
00:19:10
saying Trump was in the Epstein files.
00:19:12
And now somehow they've already repaired the relationship.
00:19:17
Crazy, crazy arc in 2025. Quote number six.
00:19:22
We've never seen anything like the user growth of ChatGPT,
00:19:25
particularly outside the US, and it shows how the global dynamics
00:19:28
of tech and distribution have changed.
00:19:30
That's Mary Meeker from her, I'm assuming annual trend report and
00:19:35
that's in May. Yeah, so this was picked.
00:19:39
Mary Meeker, Bond Capital, writes the Tech Trend report
00:19:42
that sort of sets the tone of every year of what are the main
00:19:46
changes in tech innovation or trends people should be
00:19:49
following. And this was the first year she
00:19:52
actually made the tech trend report, the AI trend report.
00:19:56
Like she was like, this is so big.
00:19:58
This is completely changing the ecosystem.
00:20:00
This is the thing that we need to cover and follow.
00:20:03
And everything will fall forward from that because it's just AI
00:20:07
everything. And I think that's true.
00:20:09
I figured, you know, we should put the Hall of Fame speaker
00:20:13
of tech trend truths into our newsletter.
00:20:16
She called it. Exactly.
00:20:18
And it's not, you know, it's revenue too.
00:20:20
I mean, ChatGPT has just grown faster than basically any
00:20:26
tech product in the world. And for whatever worry and,
00:20:31
you know, skepticism we have about multiples and valuations
00:20:36
and profits, it is a rocket ship like none other.
00:20:41
And every investor has reacted to that.
00:20:43
I guess my quick take would be that this is something that the
00:20:46
haters have to reckon with on AI.
00:20:49
Like you can talk about how consumptive it is and you know
00:20:53
its effect on creativity and jobs and all these things.
00:20:56
But to call it a scam, to call it useless, denies the fact that
00:21:00
there are currently around 900 million weekly users on this one
00:21:04
product. And if you want to tell all
00:21:06
these people that they're wasting their time and there's
00:21:08
no value to it because you don't like it, just be honest about
00:21:11
that. But like, and listen, I have a
00:21:14
million problems with AI. I don't know where this whole
00:21:16
thing is going to go. But, like, the anti-AI crowd has
00:21:19
not really digested what it means for this to be the fastest
00:21:22
growing product of all time. All right, our seventh top quote
00:21:26
for 2025. America's unique advantage that
00:21:29
no country could possibly have is President Trump.
00:21:32
That's Jensen Huang on Trump's
00:21:37
the Winning the AI Race Summit in DC from July.
00:21:40
That was a selection in part because I was, you know, I, I
00:21:44
was giving them a handout because I was banished to the
00:21:47
far end of the press room when I went to cover this event.
00:21:51
I, of course, remember that. But I felt like Jensen's public
00:21:56
statements, especially in DC. I know we're kind of policy-heavy
00:21:59
on the front half of this year. But it was, you know, a pretty
00:22:03
bold declaration that he was cozying up to President Trump.
00:22:07
And obviously this has paid out dividends in the support of
00:22:11
NVIDIA and allowing, you know, not letting the export controls
00:22:15
on the chips and sales to China go through because, you know,
00:22:18
they're best buds, so. We're not doing a Word
00:22:20
of the Year. But if there was one, it might
00:22:22
be sycophancy. And no quote embodied this, I
00:22:26
think, better than Jensen's sycophancy for Trump.
00:22:29
And he got what he wanted. Like you're saying,
00:22:33
what is the main thing NVIDIA wanted?
00:22:36
It's to be able to sell these fucking chips to China and make
00:22:39
oodles of money. And Trump came through.
00:22:41
So for shareholders, Jensen did what he needed to do, I mean.
00:22:47
In a moment of intense political cronyism, sucking up to the
00:22:51
president has worked for all these guys.
00:22:54
So why not do it? You know it's a good business
00:22:56
strategy, and if your whole goal is to help the
00:23:00
shareholders, you're doing it. Yeah, I think Jensen probably
00:23:03
sets the mark for, you know, getting the most out of
00:23:07
being an ass sniffer to Trump. Like he got the thing that he
00:23:11
wanted, which was chip sales to China.
00:23:14
That's a lot more clear to me than like regulation, which
00:23:16
probably could have happened without debasing yourself to
00:23:19
the president. But like Jensen wanted this
00:23:21
thing. He got it.
00:23:21
The stock benefits from it.
00:23:23
Good on you, Jensen. It's obviously terrible and I
00:23:27
don't know, the Democrats and the next administration... either we
00:23:30
need some sort of law that prevents CEOs from doing this
00:23:34
or... From what, kissing ass?
00:23:37
It's just, like, the name of the game.
00:23:39
Clearly we can't stop the presidency from wielding
00:23:43
this. What we need is somebody like
00:23:45
Jensen to feel like, uh oh. Like, you know, if a Democrat's
00:23:48
back in charge, I could literally get arrested for this
00:23:50
type of behavior. But I don't know, it's not good,
00:23:53
and we need to figure out how to avoid it when this terrible
00:23:57
clown show finally comes to an end.
00:24:00
Yeah, I mean on the chips ban thing though, it was so poorly
00:24:03
implemented. The cronyism needs to stop and
00:24:06
needs to be illegal. Hypocrisy.
00:24:08
Yeah, and there needs to be social and legal consequences to
00:24:12
it. Obviously people should only be
00:24:15
punished for actual laws they're breaking.
00:24:16
I'm not saying we should just make up laws post hoc and punish
00:24:20
people, but it's deeply troubling.
00:24:23
Quote 8 for 2025. I left this tour with a distinct
00:24:27
feeling that AI raises some of the same fundamental questions
00:24:31
that nukes did. How should they be used?
00:24:33
By whom? Under what rules?
00:24:35
Only this time we are creating a technology that can become
00:24:37
smarter than humans who are designing it.
00:24:40
That is Senator Elissa Slotkin speaking to the Council on
00:24:45
Foreign Relations in New York, September 5th.
00:24:48
What's up with that one, Madeline?
00:24:49
This was kind of an eyebrow-raiser.
00:24:52
Yeah, so I thought that, you know, a sitting US senator
00:24:56
comparing AI to nuclear weapons was a pretty, you know,
00:25:01
bold statement, in part because I feel like all of the AI CEOs
00:25:05
who are boosting their product and proliferating it on all of
00:25:07
us always make this comparison. So to have, you know, a sitting
00:25:11
US senator give this speech to the Council on Foreign
00:25:14
Relations saying, Oh, yes, it's just as important as nukes when
00:25:18
you take it this seriously, that's, yes, it's a serious
00:25:21
issue. But also, it shows that the tech
00:25:23
industry's narrative is, you know, taken in by the US Senate.
00:25:27
I do think ChatGPT flagged this one to me and I flagged it
00:25:30
to Madeline. So AI is influencing...
00:25:33
AI says it's important here.
00:25:34
What's interesting to me in some ways is that, you know,
00:25:39
2025 we've seen much less of the like AI supremacy AGI
00:25:44
conversation. Like I think last year we would
00:25:47
have been talking about that much more.
00:25:48
But, you know, I, I think we've maybe overcorrected and I think
00:25:52
she's right that, you know, AI has these huge existential
00:25:55
issues and a key part of it is that it keeps getting smarter
00:25:58
and is an intelligent being. I think we have another one that
00:26:02
sort of speaks to that. Yeah, I think the regulation of
00:26:06
AI is important to talk about. I think discussing a technology
00:26:09
that's getting smarter by leaps and bounds and it's going to
00:26:11
replace humanity because of its incredible intelligence is just
00:26:15
not accurate to the way this technology has progressed over
00:26:18
the last couple of years. I think it's gotten
00:26:19
incrementally more efficient and maybe gained more utility.
00:26:23
But to talk about this thing as like this incredible growing
00:26:26
force that we can't control plays into the marketing angle
00:26:28
of AI. There's a difference between the
00:26:30
complete, you know, anti AI stance.
00:26:33
This is all useless. It provides nothing and the
00:26:35
like. We have created the next Skynet.
00:26:37
And I do worry sometimes the amount of discussion that gets
00:26:42
committed to like what do we do to prevent Skynet completely
00:26:45
overrates the progress this technology has had in
00:26:48
the past couple of years. And I'm down to have these
00:26:50
discussions. But let's base it in the reality
00:26:52
of what it can do and what it can't do.
00:26:54
And what it can't do is actually think and do anything close to
00:26:57
what we think a true intelligence could do.
00:27:01
I'm closer to the... I mean, I think AI is not
00:27:05
all positive, but it's certainly... I think it's more capable than
00:27:08
you're describing. Newcomer's quote #9 for 2025.
00:27:13
Everything I do is recommended by ChatGPT and then I check with
00:27:16
my doctors for safety. I always ask them if they
00:27:18
disagree and if they disagree with ChatGPT.
00:27:20
I ask another doctor. Jesus. That is Vinod Khosla, on how much he
00:27:26
uses AI for personal health decisions at the Deus Ex
00:27:29
Medicina Summit, another Newcomer
00:27:33
co-hosted event. This was in September.
00:27:35
Yeah, I felt like, you know, Vinod was laying it on thick there
00:27:39
about how much he's using ChatGPT for health benefits.
00:27:43
And that was, of course, the theme of the conference that
00:27:46
day. But, you know, early investor in
00:27:48
OpenAI, he's talking about OpenAI.
00:27:50
It also has become a massive use case for OpenAI for people to
00:27:54
check it for healthcare questions.
00:27:55
And these startups like Open Evidence that are going
00:27:58
gangbusters right now, raising tons and tons of new funding,
00:28:01
seem to be basically ChatGPT for doctors.
00:28:03
So while I see the benefit of that, you know, it wades into
00:28:06
this whole question of how much should we be using AI for
00:28:09
healthcare? But that is clearly a use case
00:28:11
that has evolved this year. Yeah, I think doctors, we should
00:28:14
have some on the show at some point.
00:28:16
I'm interested to see how it's changed their job, not in like
00:28:18
in how they diagnose patients, but how they...
00:30:20
We have other good doctor friends we can have on.
00:28:23
Yeah, like how they deal with like questions from from
00:28:25
patients, you know, 'cause like, I'm sure like the advent of Web
00:28:28
MD, which is, you know, like the hackiest joke from the late 90s
00:28:33
of like, I went on the Internet and found I had cancer kind of
00:28:36
thing. But I bet that did really change
00:28:38
the medical profession in how you interact with patients, and
00:28:42
this has probably only accelerated it.
00:28:43
Well, I think patients come at the doctor's appointment being
00:28:46
like, I put my symptoms into ChatGPT and it said this,
00:28:49
this, this. So I think I have this and the
00:28:51
doctor has to, you know, respond in some way.
00:28:53
But maybe the doctor's also using chachi PT to check the
00:28:55
symptoms as well. So it's an interesting chachi PT
00:28:59
to chachi PT happening in the exam room.
00:29:02
Yeah, I mean, having a baby,
00:29:06
ChatGPT is the first line of defence in, is this medical thing
00:29:10
normal or not? There was a brief second where we
00:29:14
were worried that OpenAI was going to crack down on health
00:29:19
stuff, but I think it was a sort of false rumor.
00:29:22
And yeah, like you said, I think it's the use case they're really
00:29:25
leaning into. I have problems with the AI
00:29:28
industry talking about ChatGPT diagnostics as a revolution, as
00:29:34
something that's going to truly change the way people receive
00:29:38
medical care. I think it's such a poor
00:29:41
comparison a a poor stand in for actual medical care and like a a
00:29:46
professional. I know they do talk about this a
00:29:48
lot, but like we have major issues with the way people
00:29:51
receive medical treatment in this country and access to
00:29:54
healthcare, and OpenAI and ChatGPT is like a drop in
00:29:58
the bucket of improvement toward
00:30:02
what actually needs to happen to give people better quality of
00:30:04
life. So I.
00:30:05
Well, a lot of what we need is humanity.
00:30:07
I mean, you know, you go to a hospital, you know, obviously my
00:30:10
wife gave birth this year, and nurse to nurse, performance
00:30:15
feels much more about bedside manner than medical knowledge.
00:30:20
And in some ways, I hope ChatGPT sort of makes that even
00:30:23
clearer. It's like, OK, if we have easy
00:30:26
to access resources for the actual medical knowledge, then
00:30:30
we're picking and elevating people based on, you know, their
00:30:34
bedside manner more and more than necessarily how much they
00:30:39
can retain medical knowledge.
00:30:41
I'm interested in how AIs will change people's approach to
00:30:47
bedside manner and if it's going to cause people to become more
00:30:50
argumentative with their nurses and doctors, because these AIs
00:30:55
are trained to please you and to back up your questions, confirm,
00:31:00
like confirm your suspicions on things.
00:31:03
And so if a nurse or a doctor is telling you one thing and you
00:31:05
put it into ChatGPT and it's like, no, you're right.
00:31:07
You know, don't trust the medical professional industry.
00:31:11
You know, you should be willing to question professionals when
00:31:14
it comes to your health, but it sounds like it's just going to
00:31:17
create more antagonism between people and their and their
00:31:21
doctors. But doctors, I think, are
00:31:25
embracing it more than you seem to think.
00:31:25
Like I, I don't know. We'll see.
00:31:26
We should do an episode on that. Quote #10.
00:31:29
If you were thirteen years old, you should spend all your time
00:31:31
vibe coding. That's how you should live your
00:31:32
life. That's Alexandr Wang on TBPN in
00:31:36
September. Obviously he sold his company
00:31:39
Scale this year to Meta for a bazillion dollars.
00:31:43
Yeah, obviously he, he was around, you know, a $13 billion
00:31:46
exit right there, so good for him.
00:31:48
But yeah, well, I picked this quote in part because 1 vibe
00:31:52
coding obviously took off quite a bit this year.
00:31:54
We've all tinkered with Lovable and Cursor and the various vibe
00:31:59
coding tools on this show and, and the newsletter.
00:32:02
And also I think it's very funny to hear Alexandr Wang
00:32:06
dictating how 13 year olds will live their life in the future
00:32:09
from now on. I think there's a good Silicon
00:32:11
Valley culture of like, you know, we got to tell the youth
00:32:14
how they're going to change their entire education always.
00:32:17
And now it's vibe coding so. Is he a college dropout?
00:32:20
He has that college dropout profile.
00:32:23
Right. I think, I don't know if he's a
00:32:25
college dropout. I think he might have graduated
00:32:26
from college when he was 19. I think he's some sort
00:32:29
of... He was a prodigy.
00:32:31
Prodigy. Yeah, I think he's a prodigy.
00:32:32
He's a math. He was a math prodigy.
00:32:36
So yeah, I think what he did at 13 is not going to be what
00:32:39
everyone does. Not your typical 13 year old.
00:32:43
He did drop out of MIT, it looks like.
00:32:47
Yes, he has it on LinkedIn: 5.0 GPA,
00:32:50
dropout. Yeah, the dream.
00:32:54
The dream. So kind of a resume right there,
00:32:56
yeah. I think vibe coding is TBD as to
00:33:01
its actual effectiveness. There have been a lot of
00:33:03
interesting trend pieces about the vibe coding scene in San
00:33:07
Francisco, specifically in the whole startup culture built
00:33:10
around people that have no real technical skills but have
00:33:13
created their app entirely through vibe coding.
00:33:16
I guess I defer to you guys a little, but like I don't know
00:33:18
what's the last significant startup that has raised money
00:33:22
that was built on top of a vibe-coded app?
00:33:24
I don't think there have been any, and they all seem kind of
00:33:28
lame. But I had fun vibe coding something that I used this year.
00:33:33
Oh, I built a... people make fun of me about
00:33:36
this. When I go to the gym, I
00:33:39
always forget what locker I put my shit into.
00:33:42
And so I built an app through ChatGPT that kind of
00:33:47
used OCR to recognize the number.
00:33:50
So I would like hold it up to the number of the locker and it
00:33:52
would store the number. And so then at the end of my
00:33:54
workout, I would look at it and it would have the number in
00:33:56
there. The obvious answer to this is,
00:33:58
why can't you just take a picture with your phone and look
00:33:59
at that? I was going to say I love the
00:34:01
Notes app. That's a great app that also
00:34:04
exists. You can write it down.
00:34:05
You have to write it down. I wanted an app that I could
00:34:07
just hold it up to the number and look.
00:34:09
I wanted to vibe code something, and that was the idea.
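The locker-app flow he describes, hold the phone up to the locker number, store it, recall it at the end of the workout, can be sketched roughly like this. This is purely an illustration: the real app was vibe-coded for iPhone, and the camera OCR step is abstracted here into raw recognized text, with names like `LockerMemo` invented for the example.

```python
import re

def extract_locker_number(ocr_text: str) -> str:
    # OCR output is noisy, so pull out the first run of digits.
    match = re.search(r"\d+", ocr_text)
    if match is None:
        raise ValueError("no digits recognized")
    return match.group()

class LockerMemo:
    """Remember which locker the bag went into, then recall it later."""

    def __init__(self) -> None:
        self._number: str | None = None

    def store(self, ocr_text: str) -> str:
        # Called when the phone is held up to the locker plate.
        self._number = extract_locker_number(ocr_text)
        return self._number

    def recall(self) -> str:
        # Called at the end of the workout.
        if self._number is None:
            raise ValueError("no locker stored yet")
        return self._number
```

In a real version, `ocr_text` would come from an on-device text-recognition framework reading the camera frame; everything after that is just storing and recalling a string, which is part of why it makes a reasonable vibe-coding starter project.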
00:34:11
And you want a purpose. It's nice to have a purpose
00:34:13
built app, you know. Yeah, I also learned that when
00:34:18
you vibe code something, it only lasts on your phone for 30 days
00:34:22
through TestFlight, the iPhone beta testing
00:34:26
service. So it's now gone.
00:34:29
Locker Buddy is not a usable app.
00:34:33
RIP. But yeah, I think I can somehow
00:34:36
re-upload it there. But I am actually pretty skeptical of
00:34:40
vibe coding as something with real utility.
00:34:42
The fact that you can do it doesn't mean it has that much
00:34:45
value. And so beyond just like little
00:34:48
home projects like what I did, I don't really know that
00:34:50
this is quite the revolution that a lot of people are
00:34:54
wanting it to be. Tom, do you still think we
00:34:57
should all learn how to code? All of us.
00:35:02
Well, you know, they're saying you don't ever have to learn how
00:35:03
to code anymore. And the refrain was always
00:35:05
learn to code, learn to code. But now we have vibe coding.
00:35:08
The fact that we can code now in plain language, in plain
00:35:12
English, and that, you know, obviously there are limitations,
00:35:15
but it's getting better and better.
00:35:16
The fact that people who are themselves genius coders and
00:35:20
math geniuses think that you should just learn to code in
00:35:23
English, like that's hugely significant.
00:35:26
You know, I like raising my daughter.
00:35:28
It's like, OK, I'm spending a lot of time right now trying to
00:35:31
Babble with her, eventually teach her English.
00:35:33
You skip the step where you have to go learn some code based
00:35:36
language to interact with computers.
00:35:38
You can just use the English you learned, and all of a sudden,
00:35:42
you know, the humanities majors among us are going to be
00:35:45
empowered to build whatever we want on the Internet.
00:35:49
And that's validated by the math geeks this year.
00:35:52
Like, that's huge. I think it's landmark.
00:35:54
I think we're way underestimating how much it's
00:35:56
going to be a boon to the wordcels among us.
00:35:59
I believe that. You know you did it.
00:36:02
You built the app. But what value did it truly add?
00:36:05
It was a fun little project. It's like, I think everyone
00:36:08
should learn how to, like, build your
00:36:09
own airplanes. You didn't have enough ambition.
00:36:10
It was sort of a silly little project, but like, yeah.
00:36:13
Well, I don't know if like vibe coding allows you to like have
00:36:16
much more ambition than what I did.
00:36:18
Right. But we're in, like, year one or two
00:36:21
of vibe coding. We're in the beginning.
00:36:23
Like, this is where tech people dominate the rest of us.
00:36:26
You say, like, oh, it's just OK.
00:36:29
It's like, yeah, because there's a rate of growth, because they're
00:36:32
just starting. Like, this is just
00:36:35
happening. This is all so fresh.
00:36:37
But what are you going to really build?
00:36:38
What would you do? You have the ability to vibe
00:36:40
code right now? Like, what could you build in
00:36:42
your life that would change it in a way that you're like, Oh my
00:36:45
God, everyone fucking needs to do this. There are vibe coding things around the
00:36:48
Newcomer business that I would like to build.
00:36:50
We are literally talking to some of the vibe coding companies
00:36:52
about trying to build stuff. But why wouldn't you just hire a
00:36:56
developer to build something for you?
00:36:57
Well, somebody like us, we're not necessarily gonna hire a
00:37:00
great developer. It's not our expertise.
00:37:01
I don't know how to manage a developer, you know, And some of
00:37:05
the things we want, you know, it's like the beauty of vibe
00:37:07
coding, right? And This is why we had, you
00:37:09
know, Dylan Field at Figma on stage at one of our Cerebral
00:37:14
Valley events in London. You know, it's just like you get
00:37:18
to think when when you write a piece, you think about the ideas
00:37:22
and it makes your writing better, right?
00:37:24
When you're coding the thing, you're building the thing and
00:37:27
you start to understand what goes into it, what trade-offs
00:37:30
have to be made. And so I think giving the person
00:37:33
with the idea for the product, the ability to do the actual
00:37:37
thing without having to know how to code just allows you to sort
00:37:40
of think through it and it, yeah, it just empowers.
00:37:43
So that you can ultimately tell a developer what you want, when you
00:37:45
want, like, an actual functional, high-quality thing. It's best for, like,
00:37:48
prototyping right now. I agree, but like, yeah, I
00:37:51
think it's the trend. It's the worst it's ever gonna be.
00:37:53
I do think that's true, to your point.
00:37:55
Yep. Yeah, Yeah.
00:37:57
I also think. I'll hang up on you on this one,
00:37:58
Tom. I think it's pretty big.
00:38:01
Let's circle back to this one at the end of '26.
00:38:04
We'll see where vibe coding is going.
00:38:05
Sounds good. I'm I'm betting the under.
00:38:07
All right, Quote 11. But make no mistake, what we are
00:38:10
dealing with is a real and mysterious creature, not a
00:38:13
simple and predictable machine. This is Jack Clark on the
00:38:16
significance of artificial intelligence development during
00:38:18
a speech at the Curve conference in Berkeley.
00:38:20
And that was in October. Your former colleague at
00:38:24
Bloomberg. Anthropic co-founder and my former colleague.
00:38:27
Jack and I have hung out, and he
00:38:30
is... we used to go biking
00:38:31
together, tremendously. We biked.
00:38:33
I went to, like, a punk rock backyard show with him.
00:38:35
Yeah. Wait, no.
00:38:36
Way. I didn't know this, yeah.
00:38:38
Just a guy to hang with. He covered Google, the most normie
00:38:43
beat. It was a little too wonky.
00:38:46
It was like, Jack, this is Bloomberg.
00:38:47
Like, worry about money, less about tech.
00:38:49
And you like, really love technology.
00:38:51
And then he goes off to OpenAI and, yeah, it turns out you make
00:38:55
a lot more money in Silicon Valley thinking about tech than
00:38:58
thinking about money. So.
00:39:00
I remember when he quit Bloomberg to go to OpenAI and
00:39:03
we were all like, well that's stupid.
00:39:04
Why would you leave a great job like Bloomberg which you ended
00:39:07
up doing? Yeah, I I didn't think it was
00:39:11
stupid. It's just, you know, the
00:39:13
reporter Kool-Aid cult that you spend your life in.
00:39:16
You know, it's very hard to get into reporting.
00:39:17
It's hard to give it up. But kudos to Jack. And Jack, you
00:39:21
know, runs... he writes a very successful
00:39:23
newsletter, Import AI. So somehow he's able to co-found
00:39:28
Anthropic and do the newsletter hustle and be a top thinker on
00:39:33
AI policy. Anyway, Madeline, we've talked
00:39:35
about Jack, but what stood out to you about the
00:39:37
quote itself? Right.
00:39:38
Well, I felt like, one, he was fully anthropomorphizing the AI,
00:39:42
which in this episode alone we've debated if that's legit or
00:39:45
not. And it also felt really
00:39:47
representative of the anthropic take.
00:39:50
You know what I mean? Like every, all of the
00:39:52
foundation model companies have sort of their stance on how much
00:39:55
we should regulate this or not regulate this or limit the
00:39:59
contents of the speech or be concerned if it's spewing
00:40:02
dangerous ideas or what the power of the great machine is.
00:40:05
And yes, AGI fears have fallen somewhat
00:40:08
out of the conversation, but Anthropic is still doing it, and
00:40:11
they're still very concerned about safety.
00:40:13
So that was sort of the signpost of Anthropic's posture,
00:40:17
I felt, as a quote that we should put on the list.
00:40:21
Yeah, I feel like it fits in with the Slotkin point of view,
00:40:23
which is, yes, 2025 is the year where doomerism is out of favor and
00:40:28
Tom's view is ascendant. But there are still holdouts who
00:40:32
think this this is a thing heading to general intelligence.
00:40:36
It is a real and mysterious creature.
00:40:39
I sort of agree with Jack. What I actually take away from
00:40:44
this is less... well, I've already started my piece about,
00:40:47
you know, the anthropomorphization of
00:40:48
technology. I am really interested in like
00:40:50
the separation of Anthropic from the rest of the AI field right
00:40:54
now, just as a, like, kudos to Dario: he's just a very
00:40:59
earnest guy. Like, he has stuck by his guns:
00:41:02
I'm very worried about AI, I want to talk about regulation.
00:41:06
He has been caught in the crosshairs with David Sacks
00:41:10
and the Trump administration. He he truly does not like Trump,
00:41:13
which, you know, causes him potential business concerns.
00:41:18
And I think, you know, going out of this year, Anthropic looks
00:41:21
like the better bet as a company right now just in terms of its
00:41:25
financial stability. Its aim is very true.
00:41:28
It's not trying to do a, you know, a million different things
00:41:31
and, you know, a lot can change. I saw Elon, you know, shitting
00:41:35
on Anthropic at some point during the year saying like
00:41:37
Anthropic winning was never in the cards or something like
00:41:40
that. And they might go public next
00:41:44
year, which is by no means success.
00:41:45
You know, that's not like an end point.
00:41:47
It's not like you've reached the end of your journey and you've
00:41:49
won. But like, they're building
00:41:51
something that feels a lot more reasonable right now.
00:41:54
Well, it's also funny Elon saying that when he's running,
00:41:56
you know, xAI as well, as if he's just, you know, commenting on
00:41:59
this like he doesn't have a direct competitor stake.
00:42:01
Yeah, we live in the world where you pick your philosopher king,
00:42:04
and I think everybody on this podcast would prefer Dario to
00:42:09
Elon Musk and Sam Altman. Plus, we all agree that Claude
00:42:12
is the better writing partner, I have to say.
00:42:15
Yes, you know, although I'm starting to get a little sick of
00:42:17
his antics, man. The man behind the philosopher
00:42:20
king. So all right quote #12.
00:42:24
Overall, the models are not there.
00:42:26
I feel like the industry is making too big of a jump and is
00:42:29
trying to pretend like this is amazing and it's not.
00:42:31
It's slop. That is Andrej Karpathy, my man,
00:42:35
during his interview on the Dwarkesh Podcast, and that is from
00:42:38
October. Yeah.
00:42:40
So I think you can tell it worked out quite well that these
00:42:43
were chronologically next to each other, because I did pair
00:42:45
these together when I was able to just throw them all on the page.
00:42:49
You know, we go from Anthropic's oh my gosh, the
00:42:52
consequences of our actions, how will we live in the future,
00:42:55
to: where the models are right now is bad. And that
00:43:00
kind of broke the internet, in the tech world at least, when that
00:43:02
episode came out, having, you know, Andrej Karpathy basically
00:43:05
say the quiet part out loud: these models are great,
00:43:09
but there are things that they can't do as well as we'd like, and
00:43:12
they're not... are they god kings
00:43:14
at this point? No. Yeah, I mean, this was the
00:43:15
interview that shook Silicon Valley.
00:43:17
Everybody takes what Carpathy has to say extremely seriously.
00:43:21
He is both technically capable and speaks in plain English.
00:43:25
And I, I don't think anybody is disagreeing with him directly.
00:43:30
You know, he's a little bit more cautious than maybe Dario, but he was,
00:43:33
you know, less sanguine about the rate of progress than
00:43:37
I think Silicon Valley had been up until that point.
00:43:40
Yeah, this is I think the first time we've mentioned it so far,
00:43:42
but the word slop, which was the actual word of the year,
00:43:45
although I like your sycophancy as another choice or word of the
00:43:48
year, Eric, I think slop is a fascinating concept.
00:43:51
It gets misapplied a lot. People are just calling
00:43:53
everything they don't like slop, which is unfortunate
00:43:57
because I think it's a very specific thing, which is content
00:44:00
that is unmoored from general taste or interest, and it
00:44:04
just sort of gets produced and, you know, thrown out into the
00:44:07
world. But the AI industry will have to at some
00:44:14
point reckon with the
00:44:14
prevalence of slop that has come out of these tools.
00:44:17
I think Meta's launch of Vibes personifies the slop.
00:44:21
Is that a fair statement? Like when Vibes came out when
00:44:24
they were rushing to meet Sora, that to me was the slop of 2025.
00:44:30
Just a fully scrollable AI generated content app with no
00:44:34
taste discernment, no rhyme or reason, just scrolling through
00:44:37
the endless feed. That to me more so than just,
00:44:41
you know, slop being something I don't like.
00:44:43
To your point, Tom, that was like the definition of what
00:44:46
we're really talking about
00:44:48
here. Yeah. And is it a replacement for
00:44:50
entertainment? Is it going to get, you know,
00:44:51
infused into, quote-unquote, mainstream entertainment?
00:44:54
Or will it exist as this parallel industry?
00:44:57
The idea of someone just tuning out and like, scrolling through
00:45:00
slop is very nihilistic to me. And I'm not even saying
00:45:04
that... like, I believe AI can have many positive effects on
00:45:09
entertainment. But a slop feed, even Sora, the idea that
00:45:14
someone could, like, go through Sora as a social media tool, not
00:45:17
just a creation tool, but just checking out other people's
00:45:20
creations. That's weird.
00:45:22
That's really weird to me. I I don't love that idea.
00:45:25
Sycophancy and Slop 2025. All right.
00:45:29
Quote 13 for our top quotes in 2025.
00:45:33
Having listened closely to my fellow San Franciscans and our
00:45:36
local officials, and after the largest and safest Dreamforce
00:45:39
in our history, I do not believe the National Guard is needed to
00:45:43
address safety in San Francisco. My earlier comment came from an
00:45:47
abundance of caution around the event, and I sincerely apologize
00:45:50
for the concern it caused. That was Marc Benioff on X after
00:45:55
asking Trump to send the National Guard into San
00:45:57
Francisco ahead of Dreamforce. Tom has
00:46:01
written a profile of Marc Benioff, in which Benioff sicced
00:46:05
the Governor of California, Gavin Newsom, and Yo-Yo Ma on Tom
00:46:09
to keep the profile upbeat. So I'm sure Tom will have lots
00:46:13
to say about Benioff. But Madeline, you wanna take a
00:46:16
first crack at why we picked this one?
00:46:18
Yes. So many will ask, why did I not pick the 'send the National
00:46:23
Guard' quote in and of itself? Fair point.
00:46:25
However, I felt like the five-day, like maybe not five days,
00:46:30
but super quick, apology to the mass backlash was such a, like,
00:46:35
tail-between-your-legs moment, and it really personified to me.
00:46:39
Also, as we've talked about, tech's move to the right, and how
00:46:43
quickly they'll swing back when they realize that it's
00:46:47
not working for them. Yeah, no, it was
00:46:50
astounding that he did this. Marc
00:46:54
Benioff, you know, the good loyal Democrat, the one who was
00:46:57
to the left of a lot of Silicon Valley Democrats, calling for,
00:47:01
like, more taxes in San Francisco.
00:47:04
And then, you know, then he calls for the National Guard.
00:47:08
It just seemed totally, you know.
00:47:11
He called once for a boycott of Indiana because of its anti
00:47:14
abortion law. He threatened to pull his
00:47:17
employees in Indiana out of the state because he was so offended
00:47:21
by it. Right.
00:47:22
He was like the moralizer in chief.
00:47:25
I mean, he was, he embraced sort of wokeism, right?
00:47:28
I mean, he was yeah, definitely pro like sort of the push to
00:47:31
DEI. I I mean, the business case is
00:47:37
just Salesforce is suffering and could use a a boost.
00:47:42
And, you know, maybe he's like, I need some of this Trump
00:47:44
favoritism I hear so much about, but I don't know.
00:47:47
Tom, get in Marc Benioff's head for a second here.
00:47:50
I mean, I think, I think I, I always used to use Benioff and
00:47:54
Ron Conway as examples of when tech people get involved in
00:47:57
politics and actually know what they're doing because they tend
00:48:00
to win. You know, Benioff supported Prop
00:48:03
C and it won. Whether or not that's been good
00:48:05
or enough for the city is a different conversation.
00:48:07
But he, like, he backs winners. He knows when to get involved
00:48:10
in stuff. He claims. Yeah, yeah, a tax on businesses, which
00:48:13
was, you know, to help fund homelessness initiatives.
00:48:16
And he spends like he's a big philanthropist and all this
00:48:18
stuff. You know, Ron Conway essentially
00:48:20
picks the mayor in San Francisco every every time there's an
00:48:23
election. And you saw a split between them
00:48:26
after this comment, after his like National Guard comments to
00:48:29
a point where Ron Conway, I don't think he's really buddies
00:48:31
with him anymore. And, and like, you know, stepped
00:48:33
down from the board of one of his foundations.
00:48:36
Kudos to Ron Conway, the only tech guy with the backbone right
00:48:39
now. Yeah, in San Francisco and and
00:48:42
Dario, I think. We can put
00:48:46
Benioff on Newcomer's good list, you know.
00:48:48
Yeah, we've got Ron. A nice list and a naughty list.
00:48:50
That would have been fun. Yeah, I think Benioff has got a
00:48:54
lot of problems to deal with, you know, as a business right
00:48:58
now. I mean, I think he's
00:49:00
just out of touch. Like, I don't know, like, what's wrong
00:49:02
with him? Like, I guess he
00:49:04
just seems to not be able to read the moment.
00:49:06
Or maybe he's reading it better than anyone else and things will
00:49:08
come back around. But I thought you saw between
00:49:10
the National Guard comments and then he, you know, went to a
00:49:16
Trump state dinner, which is fine, a White House dinner, like
00:49:18
it's important for these guys to go there.
00:49:20
But he was, like, taking selfies with Pam Bondi at it and being
00:49:24
like, hey, the fun kids just showed up.
00:49:25
And it's like, no one likes Pam Bondi, dude.
00:49:28
Like, like, I don't even think Trump likes her right now,
00:49:31
right? Like.
00:49:32
The political instincts are way off right now.
00:49:34
The only thing that's more off than Benioff's political instincts is
00:49:37
Salesforce's stock price. But.
00:49:41
Benioff was was he on the podcast this year, by the way?
00:49:44
Our podcast. Yeah.
00:49:46
Well, things. And at the end of that podcast,
00:49:49
if you stuck around to the end, I pressed him, like Ron Conway,
00:49:52
where I was like, what's going on?
00:49:53
Like you won't stand up to these guys.
00:49:55
And there were early hints because I was like, you don't
00:49:57
have more of a backbone than this.
00:49:59
You've been such a loud, like Democrat.
00:50:01
And he was pretty cowardly. Like I was definitely
00:50:03
disappointed on that episode. But yeah, he's in a weird...
00:50:09
I think people lose touch at a certain point.
00:50:11
Even if you have a great feel for the, you know, the instincts
00:50:14
and the politics, at some point you just, the, the, the scene
00:50:17
moves on without you and you start making these kinds of
00:50:19
mistakes. And Benioff had so many of them
00:50:21
this year. I think the general billionaire
00:50:24
world and successful business world did not clock the shift
00:50:30
against Trump happening as quickly as it
00:50:34
has. And I think that they thought
00:50:35
that they maybe had a couple more years of this before the
00:50:37
pendulum swung. But the economic fallout and
00:50:40
people don't like the immigration crackdowns and
00:50:43
basically public opinion has shifted remarkably quickly away
00:50:47
from Trump in the last few months.
00:50:49
So I think a lot of people were caught a little off
00:50:51
guard. Or we're going to get real fascism, and we haven't gotten
00:50:54
the memo yet. And these guys...
00:50:57
Yeah, that's what I'm saying. He might have better instincts
00:50:58
than all of us. All right, quote #14 for 2025. 'A
00:51:02
rejection of any kind...' I should, like, shift in my chair
00:51:05
a lot as I'm reading this.
00:51:07
Yeah. Do you have any Adderall over
00:51:08
there or? Yeah.
00:51:09
Is that what it is? OK.
00:51:11
You could do a push-up. Adderall, I think, I won't get in
00:51:14
trouble for joking about. I don't know what it is.
00:51:17
My son is neuroatypical, so I feel OK calling him out for
00:51:21
this. A rejection of any kind of
00:51:23
shared and defined sense of common culture in this nation
00:51:26
and others has significant costs.
00:51:28
We should, indeed must return to a shared national experience, an
00:51:31
embrace of common identity that, by definition, puts forward
00:51:34
certain ideas, values, cultures and ways of living at the
00:51:38
exclusion of others. That's Alex Karp in Palantir's
00:51:42
Q3 letter to shareholders from November.
00:51:45
Not DealBook, where, you know, we're referencing the
00:51:49
crazy physical antics of Alex Karp at DealBook, though I
00:51:54
don't know if the actual prose was this
00:51:56
good. Quote the prose from the DealBook
00:51:59
moment. If we could include a video clip
00:52:02
in this post, I would have included a video.
00:52:04
However, I wanted to get across the sentiment of the twitchy
00:52:08
conversation. Yeah, I think Alex Karp has
00:52:11
been, you know, a really prominent CEO this year in tech.
00:52:14
Obviously, Palantir stock has gone up quite a bit and down and
00:52:18
fluctuated, but you know, it's been a stock to watch and his,
00:52:22
you know, embrace of the West and the Trump
00:52:25
administration and kind of defense tech as a larger theme.
00:52:30
The West as a synonym for white people, or, I don't know,
00:52:34
the West is, yeah... what does the West mean?
00:52:37
They say it's, like, Judeo-Christian values.
00:52:39
You know, it's a catch-all for all that, you
00:52:41
know, and how best we should do business and live our lives.
00:52:45
And his sort of embrace of that kind of went mainstream and it
00:52:48
was enough that he put it, you know, in his quarterly letter to
00:52:51
shareholders that like this is the philosophy we are going to
00:52:53
follow at Palantir. And I think it's a philosophy
00:52:56
that guides a lot of these defense tech companies that
00:52:59
have popped up, that are all kind of children of Palantir in
00:53:02
a way, or children of Anduril, which also came out of
00:53:04
Palantir, with a lot of ex-Palantir guys there.
00:53:07
So felt like this was a way to capture the defense tech moment
00:53:10
and maybe we can, you know, embed a little twitchiness
00:53:15
somehow. This — I take great umbrage at this:
00:53:19
"embrace of common identity that by definition puts forward
00:53:23
certain ideas, values" — like, I'm just — you know, the great thing
00:53:29
about being a country organized around the Declaration of
00:53:32
Independence and Enlightenment ideas is that we're organized
00:53:36
around ideas directly. Like, I feel like they're
00:53:40
trying to sort of — yeah, you know, there's a
00:53:44
lot of identitarianism obviously on the rise this year, and this
00:53:49
seems to try and have it. It's —
00:53:51
Trying to have it both ways. It is.
00:53:53
It's trying to say, you know, yes, we're rah, rah America.
00:53:57
But if you're being rah, rah America, you have to embrace,
00:53:59
you know, American values of being a country founded on
00:54:02
ideas, not identity — you know, nationality and race — for
00:54:08
that narrative. And he's also trying to do it
00:54:12
pretty heavily. But yeah, our identity is what
00:54:14
matters, and that's the right one.
00:54:15
So it felt like, you know, really captured the spirit of
00:54:18
those guys. Yeah, Karp's a weird one though,
00:54:20
too. He doesn't really fit the mold
00:54:21
of a lot of these people. I mean, the dude voted for Kamala
00:54:25
and so he's also — like, they always bring him out as the, like,
00:54:28
token Democrat defense tech guy at all these
00:54:31
events, yeah. Like, he's kind of a lib.
00:54:33
He's just sort of all over the place.
00:54:34
I never know what to to draw from him.
00:54:36
I like the idea of him as like an avatar of kind of the defense
00:54:40
tech moment too, because Palmer Luckey is the same sort of
00:54:42
thing. He's kind of hard to categorize,
00:54:45
but these guys matter. But Palmer is much more socially
00:54:49
of the MAGA world than — yes. Well, I mean, Karp is close to
00:54:52
Peter Thiel, so I mean, they're all — I mean, it's
00:54:54
Complicated. They're all complicated.
00:54:57
I don't have a great read on Alex Karp.
00:54:59
I used to go to Sun Valley, the big media-tech
00:55:03
conference. Not as a man with a badge — as a reporter clinging to the edges.
00:55:08
Yeah, I would beg for a morsel. Please, please, CEO, can you
00:55:12
give me a scoop about what coffee order you're making right
00:55:16
now? Yeah, yeah, yeah.
00:55:18
It's, it's like — that's not even a scoop, right?
00:55:20
It's like a "Tim Cook had a coffee with Mark Zuckerberg" scoop, you
00:55:24
know? No, no, it was Tim.
00:55:26
Tim. Tim Apple was at the coffee shop
00:55:28
outside Sun Valley and he tried to pay with Apple Pay and they
00:55:31
didn't accept it. And we wrote a little item about
00:55:34
that for the information. It got picked up by Business
00:55:36
Insider. Oh, that's
00:55:38
a great wheel of content. Glad you got aggregated there.
00:55:41
Yeah, it really is one of the truly humiliating experiences
00:55:44
of my life is going to Sun Valley.
00:55:46
But I would see Alex Karp walking around, you know, in
00:55:49
between the lodge and the duck pond.
00:55:52
And the amount of nervous energy emanating from this guy could
00:55:55
power a fucking data center. That guy is.
00:55:58
There's a lot. There's a lot going on there.
00:56:01
Let's move on. All right quote #15.
00:56:04
I like this one. This is where we're looking for
00:56:06
an ecosystem of banks, private equity, maybe even governmental,
00:56:09
the way governments can come to bear meaning like just first of
00:56:12
all, the backstop, the guarantee that allows the financing to
00:56:15
happen. So that was Sarah Friar, of
00:56:17
course, OpenAI CFO, on whether OpenAI would need a federal
00:56:20
backstop against its spending. And that was at The Journal —
00:56:23
the Wall Street Journal's Tech Live event in Napa.
00:56:26
And that was in November. Great, great pick, Backstop.
00:56:30
That's the backstop. Another word of the year,
00:56:32
honestly. Backstop really rang around
00:56:34
Twitter. I mean, that one speaks for
00:56:35
itself, you know, OpenAI CFO calling for a federal backstop
00:56:40
in words that she quickly walked back after this live event.
00:56:44
But that's the power of live journalism and why we have a
00:56:46
thriving events business, because you say stuff on
00:56:49
stage that maybe reveals what you're really thinking about.
00:56:52
There was a lot of hand-wringing from OpenAI after that, where
00:56:54
they were like, that's not what we meant,
00:56:55
my words were taken out of context,
00:56:57
I was just agreeing broadly with the premise of this.
00:57:00
And they had to, you know, announce that they never wanted
00:57:02
to do it. And then it turned out there was
00:57:04
some letter that OpenAI had actually sent to the
00:57:06
administration in which they had floated this.
00:57:09
I don't have the reporting to back this up.
00:57:10
This is purely based on intuition.
00:57:12
They fucking want a federal backstop.
00:57:14
There is no question in my mind that OpenAI wants to be sure
00:57:18
that when they start to take out enormous loans to pay for
00:57:21
their compute needs, they will
00:57:24
not default as a company. And they got called out for it.
00:57:27
They kind of got a little too close to the sun.
00:57:29
You saw everyone, you
00:57:31
know, from left to right, had a policy memo document that floated this idea.
00:57:35
This was not an errant thing she said.
00:57:38
And it really feels like if they'd done it more artfully,
00:57:41
they might have gotten away with it.
00:57:43
Like, it's not totally outside of reason that they could
00:57:47
convince the Trump administration to say, hey, it's
00:57:50
very strategically important that we have an investment.
00:57:53
We're going to guarantee loans to open AI.
00:57:55
There's no reason they'd fail. But I mean, thankfully, I
00:57:59
think as we said at the top of this episode, there are enough
00:58:03
AI skeptics that I think would be very politically
00:58:06
unpalatable. And anyone who believes in free
00:58:09
enterprise and risk, and thinks bailouts distort the entire
00:58:14
economy, and that capitalism means capitalism, should be
00:58:17
against a fucking federal backstop.
00:58:18
Like that's the risk investors are meant to take.
00:58:21
They're betting on whether AI is, say, you know, a safe
00:58:24
financial investment or not. And that's the province of the
00:58:28
free market and investors, not the government.
00:58:30
So yeah, I'm happy to root against it.
00:58:34
That's what venture capital is — it's inherently a risky investment.
00:58:38
Like, that is, you know, where these companies are birthed
00:58:40
from. You know, that's kind of the
00:58:42
whole deal that you're taking a risk and it may not work out,
00:58:45
but you get super rich otherwise.
00:58:46
So it's funny to see, you know, OpenAI, the most valuable
00:58:50
venture-backed private company in the world — maybe, is it,
00:58:55
compared to SpaceX? Where are they?
00:58:56
Are they neck and neck right now?
00:58:57
But anyway, it's yeah, you know, we'll do our business in
00:59:00
private. It's not even public.
00:59:01
But yeah, we need a federal backstop.
00:59:03
But it's not even venture capital.
00:59:05
It's debt — and debt comes with risk.
00:59:08
Like, that's the reason people who take on debt, who buy up debt, make money
00:59:11
from it: because it's
00:59:15
a risky endeavor. And so it was funny to see David
00:59:18
Sacks, you know, in response to this whole thing, being like,
00:59:20
the government will absolutely not be bailing out any AI
00:59:23
companies. There are lots of model
00:59:24
providers out there, which is an idiotic thing to say because the
00:59:29
whole point of being too big to fail is that, like, there are
00:59:32
wider implications to a failure. There's a fallout for it.
00:59:36
We've said this on the show and in stories on Newcomer: OpenAI's
00:59:40
strategy of taking on these massive, you know, these massive
00:59:43
deals is a too-big-to-fail strategy.
00:59:46
You know, Sam Altman has been
00:59:48
intertwining himself into the broader economic environment
00:59:51
such that like if they were to default as a company, there
00:59:54
would be a pretty large, maybe not catastrophic, but a large
00:59:57
effect on, you know, tech stocks more broadly.
01:00:01
And I don't know, I think he's engineered it.
01:00:04
This is where we're at right now: OpenAI is wound into the
01:00:07
valuation of multiple publicly traded companies, and
01:00:10
If they were to collapse one day, it would have some sort of
01:00:13
larger economic fallout. OK, brief thing here.
01:00:18
It's like equity is much safer than debt, right?
01:00:22
You know, it's like most of the money OpenAI has taken on so
01:00:26
far is in the form of equity. So if OpenAI's value collapses,
01:00:31
Microsoft or somebody could buy it at a low price.
01:00:34
It doesn't mean it collapses to 0.
01:00:36
The real problem is if it has all this debt that, you know,
01:00:39
outstrips its equity value — then, you know, everybody would want
01:00:43
to avoid buying it because they would take on all this debt, and
01:00:47
then you have a huge problem.
01:00:49
So right now I don't think we have the level of debt where
01:00:53
you have that sort of house of cards.
01:00:55
But the debt exists at the peripheries.
01:00:57
Like Oracle, which is a major partner of theirs, has taken on
01:01:00
a lot of debt to pay for data centers.
01:01:01
CoreWeave has taken on debt. Yeah, yeah, I mean, all
01:01:05
those companies aren't gonna get saved.
01:01:07
That's why it's good that it's sort of externalized from
01:01:10
OpenAI. So I don't know that I see the
01:01:14
catastrophe you're painting. But Sam Altman has an insatiable
01:01:17
appetite for more and knows that, you know he's in the
01:01:22
business of scale, right. These foundation models seem
01:01:26
still to get smarter the more they spend on training runs, and
01:01:30
so he'll take as much money as he can get.
01:01:32
And if the government giving him a backstop to get cheap debt was
01:01:36
gonna be one path, he'd try it. But he's also happy to try
01:01:39
things that don't succeed and move on to the next one.
01:01:41
And so hopefully the backstop idea didn't take.
01:01:46
To Sarah Friar: it never hurts to ask.
01:01:48
You never know until you ask. Exactly.
01:01:52
All right, Quote 16 for 2025. I think democracy's fucked if we
01:01:56
do this. That is from Laude Ventures'
01:01:58
Andy Konwinski, interviewed by me at the Cerebral Valley AI Summit,
01:02:04
and this was about what would happen if the US cedes
01:02:08
open source AI research advancement to China.
01:02:11
Yeah. So I mean, I picked that one in
01:02:13
part because I feel like the open source debate hadn't been
01:02:16
touched on as much in this list. But the increasing
01:02:20
competition with China around the AI race was a very major
01:02:23
theme of 2025. And they are — obviously, maybe
01:02:26
they're cheating, there's some evidence that suggests
01:02:29
they're taking some stuff from the US — but they are, you know,
01:02:32
kicking our ass in open source in a lot of ways.
01:02:34
So I think it's a fair point. Andy, you know, co-founder of
01:02:39
Perplexity and Databricks — very important companies — so a serious guy, and
01:02:44
has the beard to reflect his seriousness.
01:02:47
I don't know. Tom, you asked the question.
01:02:49
What was your take? The Rick Rubin of tech?
01:02:53
Very difficult. Only in a certain
01:02:54
way. Yeah, I think open source is a fascinating topic that was so
01:03:03
big at the beginning of the year because of DeepSeek and this
01:03:06
idea of, like, whether an open source model could actively
01:03:06
compete with, you know, what the proprietary, you know,
01:03:08
closed source models can do. We've kind of forgotten about
01:03:15
it. China really does seem to be way
01:03:17
out ahead right now. Not even "does seem to be" — they
01:03:20
are way out ahead. And there's a lot of open source
01:03:22
models that are being used by American companies that are all
01:03:25
made in China. And, you know, that dynamic is
01:03:27
concerning to a lot of people.
01:03:29
And you're seeing a few companies coming up, like
01:03:32
Reflection, and maybe Meta is actually abandoning open source.
01:03:38
So that's kind of interesting. But I actually, I would place my
01:03:42
bet that 5 to 10 years from now that the most important models
01:03:45
probably will be open source. It just is the trend of that
01:03:48
kind of technology. So I guess if you're someone
01:03:51
like Andy, you better hope it's coming from an American.
01:03:54
What I don't understand is, if it's truly open, then who cares
01:04:00
if it's Chinese? We can still have U.S. companies
01:04:02
that host it 100%. What does it matter?
01:04:02
I mean, is he more interested —? He's more worried about, like, the
01:04:05
capability, and just that if you're producing the open source
01:04:08
research, you understand it better and therefore you're more
01:04:11
capable. I mean, it's hard to get into, like,
01:04:14
the geopolitics of it all, but I think he was equating open
01:04:17
source with democracy, and that's a kind of core
01:04:20
belief of certain technologists is that technology should be
01:04:24
something that is democratically available and open source is the
01:04:27
only real way of getting there. And if we, you know, believe
01:04:30
in a future entirely run by AI, the public should have the right
01:04:34
to, you know, be able to adjust it as they want and not pay a
01:04:38
price every single time you want to access this technology.
01:04:41
It's an interesting idea. Quote 17 — homestretch, second to
01:04:46
last. We should have, like, corporate
01:04:48
music playing in the background, like those, like, training
01:04:50
videos, you know? Yeah, We are delighted by
01:04:55
Google's success. They've made great advances in
01:04:57
AI and we continue to supply to Google.
01:04:59
NVIDIA is a generation ahead of the industry.
01:05:02
It's the only platform that runs every AI model and does it
01:05:04
everywhere computing is done. NVIDIA offers greater
01:05:07
performance, versatility and fungibility than ASICs, which
01:05:10
are designed for specific AI frameworks or functions.
01:05:13
This is a tweet from Nvidia's newsroom after the release of
01:05:16
Google's Gemini 3, and it got a bunch of public praise.
01:05:20
And also I think it's specifically in response to a
01:05:23
story from The Information that Meta was talking to Google about
01:05:27
purchasing a huge number of TPUs from them.
01:05:30
Right, yes. So this I would call the quote
01:05:33
that personified the oh-shit moment for a lot of people about
01:05:36
Gemini 3's launch and how successful the model was while
01:05:41
being trained on Google's in-house TPU chips, to where
01:05:44
everyone kind of realized Google has the full package under one
01:05:49
roof and we should not count out Google at this moment in the AI
01:05:53
race. They have top scientists. They
01:05:55
have a really strong, high-performing model.
01:05:57
They have a great image model. They have a search engine
01:06:02
business that prints money for them to power all of this.
01:06:04
So they aren't necessarily as reliant on debt as maybe some
01:06:07
other people will be. And they have the chips
01:06:09
themselves. So that's kind of — you know, we
01:06:12
can debate if they're a monopoly or not, since they've kind of been
01:06:15
told they are, but it doesn't matter.
01:06:16
But they have an advantage here. Google stock is up 65% year to
01:06:22
date. It certainly has not been a
01:06:25
surprise to us, as you've heard on this podcast before.
01:06:28
I guess we should make more money betting on stocks we
01:06:31
believe in, because — Tom came into the job
01:06:35
betting on Google, or narratively betting on Google, and
01:06:38
unfortunately the stock is doing well.
01:06:40
But NVIDIA — I, you know, I think, as Tom is going to get into, this
01:06:44
is extreme defensiveness, PR from NVIDIA — like, a doth-protest-too-much
01:06:48
situation. Yeah — what are you —? We're
01:06:51
doing great. Our business is doing awesome.
01:06:53
And don't forget that you'll need a lot more than just
01:06:56
one type of chip. Exactly.
01:06:58
We're specialized. Yeah, I don't want to go into
01:07:01
the business of telling other people how to manage their comms
01:07:05
strategy, but sometimes not saying anything is the best
01:07:08
response to an event — and not even a particularly damaging one.
01:07:12
Like, NVIDIA is a four-fucking-trillion-dollar company.
01:07:15
You can handle like one or two bad days of trading because of a
01:07:19
story saying that like one company is going to be buying
01:07:22
your competitor's chips. Like you guys are still
01:07:25
basically owning the whole AI chips game.
01:07:28
So maybe maybe don't talk too much about that.
01:07:30
But like in terms of Google, I have not actually messed around
01:07:34
with the Gemini text model that much.
01:07:36
I haven't, you know, gone back and forth with it, that sort of
01:07:38
thing. But Nano Banana might be my
01:07:40
favorite app of the year. I think it's an incredibly fun
01:07:45
image generation tool. It's so quick.
01:07:48
It really understands what you're asking for. I think
01:07:51
that really was the sort of lead piece of the Google
01:07:56
quote-unquote comeback story: the
01:07:57
effectiveness of Nano Banana. And they've still got it. Demis —
01:08:02
he's still got it.
01:08:03
NVIDIA is up 36.5% this year, so not bad, but it's not as
01:08:09
sexy as it had been. Over five years,
01:08:14
it is up 1,350%. So yeah, we got so much
01:08:21
growth in, what, 2024? 2025 for NVIDIA was just
01:08:28
not as exciting. But Google, Google has been
01:08:31
insane. And we're getting a little
01:08:33
defensiveness from NVIDIA and their public, from their public
01:08:37
relations department. All right, our final quote for
01:08:41
2025 quote #18. I cannot imagine having gone
01:08:45
through figuring out how to raise a newborn without ChatGPT.
01:08:50
Do it again, he's not that bad. I cannot imagine having gone
01:08:56
through figuring out how to raise a newborn without ChatGPT.
01:09:00
That's Sam. Is that a better Sam?
01:09:01
I don't know. I mean, it's usually a little
01:09:02
more muted and kind of confused.
01:09:05
He's totally reinvented his voice.
01:09:06
That's Sam Altman. We refer to him as Sam.
01:09:08
We've literally gotten questions from YouTube commenters being
01:09:12
like, who is Sam? I think you can default.
01:09:15
Find another podcast. Sam — Sam Altman. But yes, there are
01:09:19
other Gamgees, from Lord of the Rings.
01:09:23
Which Sam do you think we're talking about?
01:09:25
Sam Altman on The Tonight Show with Jimmy Fallon, December 8th,
01:09:27
2025. Madeline, why was this our last
01:09:29
quote of 2025? It was, you know, the Sam Altman
01:09:33
"go on TV and act human" challenge. Oh — failed.
01:09:37
It was, you know — every time we have these guys, you know,
01:09:40
come out and defend. I mean, so much of this list was
01:09:43
really the AI industry coming out and saying why they're
01:09:46
important, why they should matter — political leaders saying
01:09:49
why they should matter, defending them as well.
01:09:51
This was, you know, to the normies. The Tonight Show — we're
01:09:54
going on a middle-America network talk show.
01:09:58
Let's chat: why is AI so great?
01:10:00
Oh, this thing — to your point, Eric — that we humans have been
01:10:04
doing for millions of years? Can't do it without AI.
01:10:07
Real relatable content there. Which is funny because on this
01:10:11
podcast I often invoke using ChatGPT for my newborn.
01:10:15
So I'm sympathetic that it is a useful thing.
01:10:19
But just saying it — it's the hyperbole of the fundraiser
01:10:22
Sam Altman, right? He gives these maximalist
01:10:25
statements to get people to give him their money.
01:10:28
And then sort of suggesting that you, like, needed ChatGPT to do
01:10:33
the most essential of human tasks.
01:10:36
Raising your child just shows the level of disconnection from
01:10:40
real humans. That doesn't make you a great
01:10:42
spokesman on The Tonight Show. Yeah, I think 2026 is gonna be a
01:10:49
very critical one for the AI industry, as we are well past the
01:10:53
point of, like, fun, novel, growing technology.
01:10:57
OpenAI — ChatGPT is probably going to reach
01:10:59
a billion weekly users at some point early next year.
01:11:02
And their big battle is going to be getting people, the normies,
01:11:07
comfortable with this technology and wanting to use it
01:11:10
everyday and finding utility in it so that they're willing to
01:11:13
pay money and, you know, make this company at some point
01:11:16
financially stable. And if you can't do better than
01:11:19
Sam did on probably the softest possible interview in media,
01:11:24
then it's not going to be an easy task.
01:11:28
Yeah, Jimmy Fallon, really known for his hard-hitting
01:11:30
interrogative questions there. Yeah — the guy ruffled
01:11:34
Donald Trump's hair.
01:11:37
Maybe he should have done that for Sam.
01:11:39
Yeah, he should have. Yeah, that would have been good.
01:11:41
Should have ruffled his hair,
01:11:42
but I don't know. All right, those are the 18
01:11:44
quotes that defined 2025. Thanks for listening.
01:11:48
See you in 2026. Bye bye.
01:11:53
Thank you for tuning into this week's episode of the podcast.
01:11:55
If you're new here, please like and subscribe.
01:11:57
It really helps out the channel. Listen in for new episodes every
01:12:00
week, wherever you get your podcasts.
