Inside the AI Gold Rush (One Year Later)
Newcomer Pod · October 31, 2025 · 01:10:14 · 64.3 MB


Click here to learn more about MongoDB for Startups - https://www.mongodb.com/lp/solutions/startups/partners?utm_campaign=startup_partner&utm_source=newcomer&utm_medium=referral

This is the leadup to our newest installment of the Cerebral Valley AI conference we're hosting on November 12th. In this episode of the podcast, we're joined by the co-hosts of the conference - Max Child and James Wilsterman - as we dive deep into the complete state of the tech startup world. We'll also revisit our annual AI Fantasy Draft and check out how each portfolio performed.

🎙️ Topics in this episode:

- Where AI investment went in 2025 and where it's heading in 2026

- The challenges and opportunities emerging from the AI boom

- The latest updates from our Annual AI Fantasy Draft League: who’s leading, who’s lagging, and what bets might pay off next


00:00:00
Artificial intelligence. AI AI AI AI AI AI AI.

00:00:04
AI. 2025 has been another crazy year for artificial

00:00:08
intelligence. We've seen the rise of thinking

00:00:10
models. We've seen video creation,

00:00:13
including OpenAI's fictionalized version of it.

00:00:16
We are about to host the Cerebral Valley AI Summit, our

00:00:20
ultimate insider AI event for startup founders and investors.

00:00:25
I host that event with Max Child and James Wilsterman, my dear

00:00:28
friends and the cofounders of the voice AI games company

00:00:31
Volley. They're coming on the show to

00:00:33
pick up where we left off on our AI startup draft.

00:00:37
We've been picking the biggest names in AI startups, and it's a

00:00:41
good way to learn about the top Silicon Valley startups that

00:00:44
insiders are paying attention to.

00:00:45
So we're going to have a fun time picking up new companies

00:00:48
and dropping them from our list. But first we will discuss some

00:00:51
of the key themes in AI going into Cerebral Valley,

00:00:55
thinking through what's really happened over the last 12

00:00:58
months. So give it a listen.

00:01:07
All right, let's get into it. We've got the Cerebral Valley AI

00:01:12
Summit coming up November 12th. I'm here with Max Child and James

00:01:16
Wilsterman, my Cerebral Valley co-hosts, good friends and the

00:01:20
co-founders of the voice AI games company Volley.

00:01:25
We have two main objectives for this episode: one, themes, which is

00:01:30
half preparation for the event for ourselves and half just sort

00:01:33
of like, OK, where where are we in sort of the zeitgeist of AI

00:01:37
going into this event? And then we'll see what comes

00:01:41
out of it afterwards. And then to our favorite part of

00:01:45
the show, we're going to update our our drafts in a second.

00:01:49
So hey guys, welcome to the show.

00:01:51
Hello, Eric. Glad to be here.

00:01:52
Hey, Eric, hope you're doing well.

00:01:55
Yeah, I've got a screaming child.

00:01:58
We're a day... We're 28

00:01:59
days. The draft gets heated. You get in

00:02:02
trouble over there. Yeah, keep it down.

00:02:04
No, I have enough distance that I think I'm far away, but a

00:02:07
screaming child can can carry quite far.

00:02:11
Great. And you guys have been dialed in

00:02:13
thinking about sort of the big theme.

00:02:16
So let's let's get into it. Yeah, I think the goal here is

00:02:21
everybody come with at least two, if not sort of three themes

00:02:24
that you think will stand out in AI, generative AI. I mean, we do

00:02:30
Cerebral Valley twice a year. We did London mid year, we do

00:02:34
San Francisco in November. So these are themes, you know,

00:02:38
meant to, I think look at like a six month window or sort of

00:02:41
what's, what's the mood for the next 6 months in AI. Max,

00:02:46
I know you've thought about this a lot. You

00:02:47
want to go first? I mean, I have a bit, I have a

00:02:49
bit of a look back and a look forward here in terms of 2025

00:02:53
and beyond. I think that what's interesting

00:02:55
about this year is there were a lot of kind of industry

00:02:58
prognosticators trying to call certain themes for 2025 back in

00:03:02
like January, February, March. I remember distinctly I had some

00:03:07
very intimate conversations with folks who worked at OpenAI.

00:03:10
Who? Intimate.

00:03:11
Intimate. They were intimate.

00:03:13
You're like in a sauna. Yeah, yeah.

00:03:16
Yeah, Yeah. Let me tell you.

00:03:19
The Gary Chan sauna. A couple cocktails, sauna

00:03:23
action. It was a great time.

00:03:26
I won't say whose house it was at anyway.

00:03:29
Yeah. And they basically were like,

00:03:31
there's going to be two big themes for 2025.

00:03:34
And I don't know if you guys want to guess these in advance.

00:03:37
They were like the two big themes are 2025 is the year of

00:03:40
agents and 2025 is the year of voice.

00:03:45
And my contention would be that both of these themes turned out

00:03:49
to be bullshit, right? Too early. I mean, like, I think that we

00:03:54
talked about this a year ago on the podcast.

00:03:56
I don't even think we'd seen a quote-unquote agent at that

00:03:58
point. And I think at this point you

00:04:00
could argue maybe we're starting to see the contours of what

00:04:03
people would define as an agent, but it's almost exclusively

00:04:05
in coding. And then voice. Look, James and I are obviously

00:04:10
voice bandwagoneers. You know, we may have hand-

00:04:13
constructed the voice bandwagon out of wood and nails about 8

00:04:17
years ago, but I don't really think 2025 was the year of voice either.

00:04:21
I think there's been progression in voice recognition, in TTS,

00:04:25
ElevenLabs, evaluations, things like that.

00:04:27
But I certainly wouldn't say we're all spending every day

00:04:29
talking to voice AI. And even the products that we use, I don't

00:04:33
think have made maybe as much progress as we would have

00:04:35
expected. So that was my initial framing

00:04:38
for this, which is those were the two expected themes for.

00:04:40
The what didn't happen? Yeah, it didn't happen. James,

00:04:42
your reaction to the ones that did happen. But yeah, go

00:04:44
ahead. Yeah.

00:04:45
I mean, I I agree, I think, I think that a lot of companies

00:04:49
built out foundations for agents.

00:04:53
I think OpenAI just came out with their kind of agents

00:04:58
drag-and-drop framework and there have been a lot of other

00:05:01
companies entering that space. There's like an Agents SDK from

00:05:06
OpenAI and there's other companies that have these now.

00:05:09
But I agree, I haven't seen like deployment or scale of these in

00:05:16
any meaningful way. I, I think the fact that we're

00:05:20
talking, that people talk about agents is itself a sign that

00:05:24
things aren't where they need to be, right?

00:05:26
Like coding customer support, like if you're succeeding, you

00:05:29
just say, oh, there's this great use case.

00:05:31
And like, I, I feel like you don't start talking about SaaS

00:05:34
and software as a service until it's like, OK, you've got some

00:05:37
products people can point to. It's like, vertical SaaS

00:05:40
becomes a thing because you've got, like, Mindbody, and you say,

00:05:43
oh, you can, you know, provide for this sector, We'll do it for

00:05:47
everybody else. But I feel like with agents and

00:05:50
you know, this isn't the first time this ever happened in

00:05:52
technology, but people are like, you know, sort of concluding

00:05:56
how all this can play out without sort of the first really

00:05:59
successful use of agents. Because we say coding, but even

00:06:04
with coding, it's like a lot of it is a sidekick, right?

00:06:07
And not necessarily like, you know, Devin running, running on

00:06:11
its own. Yeah.

00:06:13
I mean, I think that Andrej Karpathy obviously had a very

00:06:16
notable podcast a couple weeks ago on the Dwarkesh Podcast where he

00:06:19
talked about development and agents.

00:06:20
And I think the analogy he used was to self driving cars, which

00:06:24
I thought was a good one, but more like self driving cars

00:06:26
back in 2016 or 2017 where he's saying, OK, if I think of an

00:06:30
agent as essentially a smart intern who can go off and do

00:06:33
basically any sort of task that I assign them.

00:06:36
Like, you know, it can do a little bit of stuff and you give

00:06:39
it a very specific use case and a very focused set of, you know,

00:06:43
check boxes it needs to check off.

00:06:44
It can occasionally go off and do interesting things.

00:06:47
But you know, his point was what happened in self driving cars is

00:06:50
we, we, we ended up having to go through what he called like the

00:06:53
march of nines, the march of nines being what

00:06:57
percentage accurate is this thing right?

00:06:59
And we started with 90% and then we got to 99% and then we got to

00:07:03
99.9% and then 99.99% and so on and so forth.

00:07:06
And ultimately to be superior to a human, you know, Waymo, you

00:07:09
maybe need 5 nines or six nines or 7 nines or whatever, right?

00:07:13
And he's saying, hey, agents are maybe at the first nine, you

00:07:16
know, today. Maybe, you know, you could argue the second

00:07:19
nine, but I don't even think you could argue that really.

00:07:21
And we're probably still two, three, four more kind of nines away, and

00:07:25
maybe things are getting 10 times better every year.

00:07:27
So that's only three or four or five years, but he put more like

00:07:30
a 10 year timeline on sort of fully, you know, human capable

00:07:34
level agents. Why do you think companies like

00:07:37
OpenAI are moving so fast in

00:07:43
Building out the SDK? Is it the no-code interfaces, the

00:07:48
browser agents within their, their Atlas browser?

00:07:53
I guess, like, why? Is it marketing, or is

00:07:57
there a belief that this is further along than

00:07:59
it is, or, you know, a different take on how the

00:08:03
timelines might look. I mean, my argument would be

00:08:06
it's competitive defense, basically.

00:08:07
Like there's people doing these little agentic drag-and-drop

00:08:10
frameworks and they're like, we better do that.

00:08:12
And then there's people doing, you know, browsers and they're

00:08:14
like, maybe the browsers will be a thing.

00:08:16
I guess we better fucking do a browser.

00:08:17
You know, like it's like, that's my argument.

00:08:20
I could be wrong. You know, obviously they're much

00:08:22
smarter than we are. They're smarter than I am.

00:08:24
Like they figured out a lot of stuff that we would never

00:08:26
have figured out. So maybe they see the future,

00:08:28
they see the 2-3 year agentic vision and it's it's, you know,

00:08:31
they're building towards that right now.

00:08:33
I think. Yeah, maybe the more charitable

00:08:34
interpretation is they can see where this thing's going to be

00:08:37
in three or five years and that we're putting all the

00:08:39
foundations in place today, but it doesn't seem like they work

00:08:43
that well. Like I have not heard of anyone

00:08:44
using this agent drag-and-drop composer thing to build an

00:08:48
agent in the last four weeks since they launched it.

00:08:50
Have you? I mean, I'm I'm it like

00:08:52
basically vanished without a whisper in in my my social

00:08:55
feeds, I guess. I'm going to offer a theme, I

00:08:59
guess in it feels like it it follows the same pitfalls of the

00:09:04
ones that you said didn't come true.

00:09:06
It's sort of like it's you know, it's going to be voice, it's

00:09:09
going to be agents. And I'm going to say like

00:09:13
fiction frenzy, you know, it's with the Sora creation and just,

00:09:17
you know, and, and also speaking to your nines issue, it's like,

00:09:21
OK, what doesn't require stuff that's like, you know, totally

00:09:27
accurate works well. Like, you know, the problem with

00:09:29
agents is compounding. It's like you, you make a

00:09:31
mistake here. You have even more mistakes down

00:09:33
the line. So just just the, the errors are

00:09:36
are huge. Where stuff where it's like

00:09:38
create, you know, regular people create content, fictionalized

00:09:43
stuff. I think, I think that has a

00:09:47
lot of promise. And so, and then everybody

00:09:49
copies OpenAI and so they have

00:09:52
It got a lot of downloads and it's like, oh, that can work.

00:09:55
You know, we should, we should lean into that.

00:09:58
And the last thing I'll offer on this theme, maybe it's an overly

00:10:01
expansive view, but I'm trying to also capture the clearly

00:10:05
serious like mental health thing and the sort of like people

00:10:09
spinning up their own worlds and how that's going to loom in the

00:10:13
AI story too. It's like you've got, it's sort

00:10:15
of, this is sort of my consumer, consumers going wild with what

00:10:19
is possible with AI and who cares how accurate it is at the

00:10:23
moment. Why is that a tale of things

00:10:26
that didn't work out? Or how are you thinking?

00:10:29
Of that just because it's a very like oh, we're going to build a

00:10:31
Sora app, or, you know, it's like you see a glimmer of something

00:10:34
that's like you see coding and so we're all going to have

00:10:37
agents in every category you see you know, maybe a little funding

00:10:40
going to voice. I just, I just think it's the

00:10:42
sort of like you got, you got a sign of some success and then

00:10:46
the AI world is desperate to see

00:10:49
And so you're saying, oh, there's going to be so much

00:10:51
building there. So I think it's just reading a

00:10:54
little bit too much maybe from one area of success.

00:10:58
But I mean, you guys are in the you're trying to do fiction AI

00:11:01
stuff. It's your.

00:11:02
Yeah, it's possible. Yeah.

00:11:03
I mean, I was gonna sort of say, I think that AI video is the one

00:11:06
that kind of came out of nowhere this year as like an actual

00:11:09
theme of the year in some ways. Whether or not you think it's

00:11:11
good enough yet, I think that's an interesting argument.

00:11:14
But I think that Veo 3 in May and then Sora a month ago definitely

00:11:20
like sort of felt like minor nuclear explosions in this sort

00:11:24
of consumer scene, right? Especially the especially the

00:11:27
Sora era, where you could rip off classic copyrighted IPs.

00:11:32
Yeah, for like a week or five days or whatever it was.

00:11:35
Make a SpongeBob episode of yourself and any character from

00:11:40
history you wanted. And I just felt like that those

00:11:44
seem to achieve actual consumer breakouts in a meaningful way.

00:11:47
And, and they fell off a big cliff, I think, when they pulled

00:11:50
all the copyrighted material out.

00:11:53
But I do think that AI video is something that has kind of

00:11:56
crossed the chasm into actually useful now, whereas maybe a year

00:12:00
ago it was sort of more of a curiosity.

00:12:02
I think again, you can make the case that it's not a part of our

00:12:05
everyday lives or whatever, but it feels like it made a big

00:12:07
progress. And then I do think the AI

00:12:11
psychosis theme, which is a little more of a negative theme,

00:12:14
is a real one and probably a big part of the future of our

00:12:17
lifetimes is this sort of AI-induced psychosis and AI-induced

00:12:22
dream worlds and virtual friends and can they drive people off

00:12:26
the ledge and all this kind of stuff.

00:12:27
So kind of a sad one, but probably a very real part of the

00:12:32
future. And do you guys think that based

00:12:36
on what you're seeing that is happening at large scale like

00:12:39
the AI itself is the cause of psychosis or do you think?

00:12:44
Well, wasn't Sam Altman? Seems like he's owning this

00:12:46
issue a little bit, yeah.

00:12:49
Yeah. I mean, I think he was saying

00:12:50
that very recently. Yeah.

00:12:52
I mean, I was trying to read what he was saying, and a lot of

00:12:56
it was kind of couched in this idea that they have massive

00:13:01
scale. They have 800 million users, right?

00:13:04
At scale, all human problems are, yeah, OpenAI

00:13:08
problems. Like, what percentage of non-AI

00:13:12
conversations every day show signs of psychosis in the world?

00:13:15
Right? Right, A lot.

00:13:17
Exactly. It's just, yeah, we just have an

00:13:19
active psych exam. Yeah, now we have someone

00:13:22
running all the time. We

00:13:23
can actually monitor this. So I'm just curious to see the

00:13:29
causality evolve. I think the current chatbots

00:13:33
would be net good in that they're trying to ground people

00:13:36
in reality. If anything, I'm concerned about

00:13:39
where we're going, which is this embrace of what the social media

00:13:43
companies did, which is moving. You know, OpenAI, Anthropic and

00:13:48
everybody else wanted accurate answers.

00:13:50
You know, that works in the enterprise.

00:13:52
That was sort of what researchers were building.

00:13:54
But obviously the social media world learned, oh, we want

00:13:56
engagement. Who cares?

00:13:58
The humans can decide if it's accurate.

00:14:00
And so, yeah, things can get worse.

00:14:04
But right now, if the models are trying to be accurate, often to

00:14:07
me, it's bringing me back towards reality.

00:14:10
You know, it's like, OK, no, don't worry too much about that.

00:14:13
This is what's normal. I think that's true, but

00:14:16
I think clearly, like especially pre-GPT-5 and maybe in some

00:14:24
other apps, like you definitely saw models that had that

00:14:27
sycophancy problem that would, like, you know, eventually go

00:14:31
off the rails as the conversation and the context

00:14:34
window got longer and just start agreeing with you.

00:14:38
So that I think is the risk vector that still exists, you

00:14:43
know, but even even if the models you know on your first

00:14:48
attempt are pushing back. For anyone running a startup

00:14:51
right now, MongoDB for Startups is one of those programs that's

00:14:54
looking to make the life of a founder easier.

00:14:57
It's basically a shortcut for founders who want to move faster

00:15:00
and spend less. You get free credits on MongoDB

00:15:03
Atlas, so your product's ready to scale from day one without

00:15:06
worrying about database costs. They also throw in 200 million

00:15:10
Voyage AI tokens so you can start experimenting with AI

00:15:13
features right out of the gate. You'll get one-on-one time with

00:15:16
actual MongoDB engineers, people who can help you

00:15:19
troubleshoot, optimize, and think through your next stage of

00:15:22
growth. There's even co-marketing and

00:15:24
partner opportunities, which means more visibility and

00:15:26
potential customers without extra spend.

00:15:29
If you're building, scaling, or just trying to stretch your

00:15:32
runway a little longer, MongoDB for Startups can actually help.

00:15:36
Head to mongodb.com/startups to learn more.

00:15:41
I think a theme for the conference this year will be how

00:15:45
much AI is holding up the US economy, and whether... I think

00:15:55
there's a lot of downstream questions from that.

00:15:56
But like, you know, whether it's a bubble, especially in the AI

00:16:01
infrastructure build out, whether AI is going to cause

00:16:07
a bunch of massive job losses anytime soon, you know, how

00:16:12
does, how does the the music stop or how does, how does this

00:16:16
whole thing wind down? And if you didn't have these

00:16:21
massive data center build outs, would the entire political

00:16:26
situation in the US be different?

00:16:28
Would you know, the right, you know, would we be in a

00:16:32
recession? And then also like to what

00:16:35
degree is there, to what degree is there round tripping of

00:16:38
investments between all these companies that is risky?

00:16:44
I mean, the debt stuff is getting crazy. Huge amounts of

00:16:47
money. You know, the first Cerebral

00:16:49
Valley was March 2023. And there if you were in that

00:16:54
room, I should have been more entrepreneurial, but you, you

00:17:00
could have made a lot of money just being like, oh, these

00:17:01
people really see where the world's going.

00:17:03
Let's like invest in NVIDIA. You know, people did make a lot

00:17:06
of money doing that, you know, as you'll sort of see from our

00:17:10
draft picks later in the episode anyway.

00:17:14
But I think this I think this one, I don't think this is

00:17:17
necessarily the end of the bubble. I mean, the dot-com era ran for years

00:17:20
and years. And obviously like, you know,

00:17:22
some of the things sustained in bubbles, you know, it's like you

00:17:25
people start shouting bubble when you're when you're at

00:17:27
a valuation that might, you know, be below where

00:17:33
the bubble ultimately falls to. So it's like there's a run up

00:17:36
and then there's a big fall. People hate the fall.

00:17:38
So I don't know where we are, but I do think this one, it'll

00:17:43
be interesting to see who plays the role of like cheerleader.

00:17:47
It's like, yeah, things are good.

00:17:49
Like, don't stop the party, we need this party to keep going here.

00:17:51
You know, this sort of, there'll be the true believer

00:17:54
cheerleader. There'll be this sort of

00:17:55
disingenuous cheerleader where it's sort of like my job in this

00:17:58
whole business and my employees hang on me saying this thing's

00:18:01
great. And then I think, but I think we

00:18:03
attract and I think this will be a large segment of sort of the

00:18:06
realists. And this is one of the reasons I

00:18:08
like AI more than crypto. I think it's less cult like.

00:18:11
And I think there are lots of people that are like, I'm doing

00:18:14
my smart enterprise thing. We're selling to reasonable

00:18:17
customers. And yes, obviously I'm happy

00:18:20
that OpenAI might get, you know, 10% better because of

00:18:26
wasteful, you know, infrastructure spending, but

00:18:28
it's not necessarily how I'd run my business.

00:18:30
So I I think there'll be voices of caution.

00:18:31
I don't know. What do you guys expect, I guess, in terms of...

00:18:36
Hype or bubble? Bubble caution.

00:18:38
I mean, I do think it's become like the intellectually en vogue

00:18:41
thing to like talk about the bubble and like complain about

00:18:43
the bubble and everything. I'm not totally sure that

00:18:47
anyone's like investments are really indicating that they

00:18:53
believe there's really a bubble or that they believe it's going

00:18:56
to pop anytime soon, I guess would be.

00:18:57
Investments in terms of venture funds, companies, venture...

00:19:00
funds, LPs. You know, to your point about crypto, it's not

00:19:05
sort of all about like exiting with the, you know, with the

00:19:08
bags before everyone else, like, gets scammed, right?

00:19:11
Like I don't think people are selling out of these really hot

00:19:14
companies very much and I don't think that people are, you know,

00:19:19
not doing the next round at the next craziest valuation yet in

00:19:22
any meaningful way, right. I think that, I mean, Tyler

00:19:26
Cowen always has this comment about how, well, if you actually

00:19:29
believe the US political system is gonna collapse, why aren't

00:19:32
you short the market? Basically, it's a big comeback,

00:19:34
right? And it's like, there are

00:19:35
different ways to analyze that question.

00:19:37
But I think that in AI, it's like, okay, well, if you

00:19:40
actually believe we're near the peak of a bubble, you can't

00:19:43
really be short necessarily, but you could be secondarying out.

00:19:46
You could be not participating in the next hot funding round.

00:19:49
You could like, you know, you could be doing various things

00:19:51
that would indicate that you actually believe we're near the

00:19:53
top. And it doesn't really seem like

00:19:54
anyone's doing that. I mean, you're more of an

00:19:56
insider than we are, Eric, but that from the word on the

00:19:58
street, that's not what... I'm, my, I think this thing's

00:20:01
still got juice, you know, and I because I talked to, I talked to

00:20:05
skeptical people, you know, I have, you know, lots of the

00:20:09
investors I like to talk to, investors who

00:20:11
do care about market timing and sort of say, oh yeah, you needed

00:20:15
to not deploy a ton in 2021 if you want to make a bunch of

00:20:18
money. You know, like those types of

00:20:19
investors are super useful. And like, I think they're like,

00:20:24
the party's still roaring, you know, I mean, just, you know

00:20:29
what? Amazon's laying off like 10% of

00:20:31
their corporate staff. Like if it does, if new jobs are

00:20:36
going to AI, if consultants are being taken by AI, you know,

00:20:40
like there there's you can do huge market sizing on this

00:20:44
stuff. There's a funny case study I saw

00:20:49
this is one of these 'it was on Twitter' things.

00:20:51
So I should be a little more responsible.

00:20:52
But somebody was like some, you know, a company was going to buy

00:20:56
an AI startup, and then the consultants, like, as part of the

00:21:00
acquisition made a demo product that was like supposedly as good

00:21:06
as the startup in like 2 weeks. And is that pro-AI or

00:21:11
anti-AI? Because on the one hand it's

00:21:13
like, OK, you don't need to buy this.

00:21:16
You know, there are lots of like fake companies that could be

00:21:18
bought. On the other hand, it's like,

00:21:19
oh, you think it's a valuable thing that can be built in just

00:21:22
two weeks? Like it.

00:21:24
Yeah, I don't know. It seems like there's a lot of

00:21:25
stuff coming. Well, yeah, I mean, the numbers

00:21:28
that people are able to raise for compute and scaling and

00:21:31
everything are still just out of control.

00:21:32
I mean, there was that like rumor 18 months ago.

00:21:35
Remember that? Like Sam Altman was trying to

00:21:36
raise $7 trillion for scaling, right?

00:21:40
And people were like, ha, ha, ha.

00:21:42
That's so ridiculous. Like $7 trillion is like, is

00:21:46
like a third of US GDP. It's like a third of the US economic

00:21:51
output. It's just like, Sam Altman

00:21:52
personally is gonna, like, fundraise a third of U.S. economic

00:21:55
output for some microchips. And then I saw someone check in

00:22:00
on this recently and they're like, well, if you put together

00:22:02
all these like Stargate commitments, NVIDIA and AMD and

00:22:04
Broadcom and all this shit, they're like they're up to like

00:22:07
1.5 trillion now. And it's.

00:22:09
Like, right? And Sam Altman, I think he

00:22:11
basically said like, yeah, the 7 trillion, people thought that was

00:22:15
too absurd. So I had to like, come up with

00:22:16
like a smaller one that people, you know, it's like clearly it's

00:22:19
just like, yeah, we need a lot of money.

00:22:21
I'm trying to reset expectations. 1.5 trillion in

00:22:25
fundraising right over whatever 12 to 18 months.

00:22:29
Yeah, there are a lot. It's also sloppy.

00:22:31
Like, sure, of course, but still a lot.

00:22:33
I think it's big. Yeah, we can all agree on

00:22:34
that. And definitely the economy is, you know, the stock market

00:22:37
is very, you know, up on it, yeah.

00:22:40
We've just never seen this level of fundraising in any of our

00:22:43
working lifetimes, right? You know?

00:22:46
So TBD if that means they can only go

00:22:49
higher. You saw the Cursor... But.

00:22:51
This is, this is the ultimate insider bubble watch.

00:22:54
Cursor lost one of its co-founders. I do think there are

00:22:57
some of these where it's like such high flying valuations they

00:23:00
raise a round and then, like, they're going to lose a

00:23:03
co-founder. I mean, I don't know the inside

00:23:05
story there, but there, there clearly are going to be, you

00:23:09
know, some of these AI startups that get to raise on.

00:23:13
It's a good, good time to be a journalist, though this doesn't

00:23:15
happen. A lot of juicy stories that we'll

00:23:18
log on, yeah. I like that your sign of the

00:23:23
the party continuing, Eric, is the more layoffs we we see in

00:23:27
the US economy. That's pretty hilarious.

00:23:30
They're driven by AI, right? I don't know.

00:23:32
That's a good question. Like if it's Amazon, you know?

00:23:37
Yeah, or or is it just? I do have like, I mean, I do

00:23:39
have two more like sort of real themes I actually think did

00:23:43
happen this year, if that's helpful.

00:23:44
And try to stick with the term.

00:23:47
So I called mine Fiction Frenzy, I think.

00:23:50
I told mine. I called mine: Is AI holding up the US economy?

00:23:56
Too long.

00:23:57
Too long. Well.

00:23:58
Mine are. Yeah, U.S. economy hangs in the

00:24:02
balance. Yeah, anyway.

00:24:03
My themes are all internally consistent, which is they're all

00:24:06
the year of X, but it's 'Is it the year of X?'

00:24:09
We really didn't coordinate on editorial style.

00:24:12
All right, Max. So, so I had, I had the Year of

00:24:15
Agents and the Year of Voice both as being kind of bullshit

00:24:18
themes, but I think they're fake themes.

00:24:20
Yeah, there are two themes that I think were very real themes

00:24:23
this year and saw, you know, real contributions to the way

00:24:27
people work. Maybe ultimately to Amazon

00:24:29
layoffs, like, I don't know. But I think, you know, the the

00:24:33
number one thing to me that was sort of thematically important

00:24:37
this year was the year of the thinking models, right, the

00:24:40
reasoning models. And I know this isn't new news

00:24:43
anymore, but you know, o1 was only released like 11, 12 months

00:24:47
ago. o3, which is the first thinking model that could

00:24:50
incorporate links and actually start referencing real

00:24:52
information. And then ultimately, you know,

00:24:54
we got GPT-5 with the thinking version of that, you know,

00:24:58
Claude Opus, all this kind of stuff, right?

00:25:00
We, we actually made this dramatic architectural

00:25:02
transition in, in terms of how LLMs work in the last, you know,

00:25:06
6 to 12 months that I think took them to a whole nother level of

00:25:10
intelligence and utility. And at least in my personal

00:25:13
life, you know, as a startup founder, whether it's, you know,

00:25:15
doing product planning or writing memos or researching

00:25:19
things or writing emails or whatever, you know, any of the

00:25:22
million things, you know, you could use an LLM for.

00:25:24
I just, I feel like my personal usage has gone up like 10X since

00:25:27
o3 launched essentially in March or April, because I feel like the

00:25:30
utility of these models with thinking combined with sourcing

00:25:36
leads me to find them... I've seen an inflection point in

00:25:40
the value I'm getting out of these models.

00:25:43
And I think that, you know, if anything sort of real happened

00:25:45
to this year, I, I think it was the transition to thinking

00:25:48
models that could, you know, reference, make references and,

00:25:52
and actually sort of be accurate in a lot of ways.

00:25:54
And maybe that's what's leading to these job losses or these

00:25:57
crazy things you could do with these models now.

00:25:59
Astute. No, no argument. I mean, it's just how, hey,

00:26:04
AI moves so fast. It's like old news, and you know,

00:26:06
like what, 11 months ago? And it's true.

00:26:09
It's transformational in a 12 month

00:26:11
time frame, so. Transformational.

00:26:14
I mean again, I, I consider o3 just.

00:26:17
Even o3, which is less, like 8, I think 8 months or.

00:26:20
Something. Yeah, yeah, eight months ago,

00:26:22
right, So. And this was one of Karpathy's

00:26:25
sub points, right? That there, there have been

00:26:28
these improvements and we still need to link all this data and

00:26:31
get all this. I mean, it's the, the theme

00:26:34
every year is sort of like, man, there, there's a lot of value

00:26:37
right there that we have that feels like we're not explaining,

00:26:41
you know, here's how you should use it.

00:26:43
I mean, at the Medicina healthcare event, we

00:26:46
did, you know, some of these companies, I don't

00:26:50
know, are they that different than ChatGPT? But they're

00:26:52
wrapping it in like doctors feel comfortable with it.

00:26:56
They're showing doctors the sources, they're making it work

00:26:58
for medicine. And like, this wasn't one of my

00:27:01
themes, but medicine is one of the areas where AI is going

00:27:04
super hot. Yeah.

00:27:06
And. And so.

00:27:07
Yeah. I think, yeah, I think, I think

00:27:09
similarly, you know, Harvey as the legal AI, the sort of, you

00:27:13
know, scuttlebutt 2 years ago was that it, it actually sucks

00:27:16
and it's not very good, you know, as a legal model.

00:27:19
And then I do think basically it got a brain transplant from

00:27:22
these thinking models in the last 9 to 12 months.

00:27:25
And I mean, my parents are both lawyers.

00:27:27
My dad's like, this is awesome. Like, I mean, like, it's

00:27:29
amazing. You know, I just feel like the

00:27:31
the buzz you're hearing on Harvey or similar legal models

00:27:34
is just way more positive in the last 6 to 12 months.

00:27:37
And I would argue it's all kind of downstream of how good these

00:27:40
thinking models are. I mean, I think ultimately it

00:27:43
gets into this question of if you have a text in, text out

00:27:48
only model, right? You know, do we reach AGI in two

00:27:54
to three years, right? Because I think that I think

00:27:56
that I'm pretty confident that we're nowhere near AGI when it

00:28:01
comes to visual interpretation, when it comes to multi step

00:28:06
processes, when it comes to audio and video, I think we

00:28:10
haven't really reached sort of quote unquote AGI levels,

00:28:12
right? I mean, James and I were talking

00:28:14
about recently like, you know, I do these little basic visual

00:28:17
tests with GPT-5 Thinking, you know, the most advanced model.

00:28:20
I took a picture of a bunch of pomegranate seeds on a plate

00:28:23
because my daughter's really into pomegranate seeds right now,

00:28:26
and we counted all the pomegranate seeds and there were

00:28:27
like almost exactly 100 pomegranate seeds on this plate.

00:28:31
You know, it was like 99. And I took a picture and it was

00:28:33
like, there are 76 pomegranate seeds on the plate.

00:28:36
And I'm like, no, they're not like we counted.

00:28:37
There's like, you know, a hundred, 101.

00:28:39
And then I was like, hey, we counted.

00:28:41
There's like 101. It's like, you're right.

00:28:43
It's probably a little more. I think it's like 82 pomegranate

00:28:45
seeds, you know? And you're just like, OK, it's

00:28:48
not 82, it's not 75. You're like like my 3 year old

00:28:51
can count pomegranate seeds better than AGI or whatever you

00:28:54
want to. Call it.

00:28:54
Do you think? Do you think that like will get

00:28:57
solved by models or do you think that will get solved by tool

00:29:01
use? Right?

00:29:02
Where you could imagine that GPT doesn't have a tool connection

00:29:05
to the exact right ML model that can count pomegranate seeds, but

00:29:12
like, it probably exists. It's deeply wrong that it's so

00:29:15
confident when it doesn't have the tools.

00:29:17
So totally something is wrong there, like if you were just

00:29:21
dealing with another human or another.

00:29:23
They would say like, I can't count, I can't count.

00:29:25
They would just tell you. You don't know.

00:29:27
You know you get it wrong all the time.

00:29:28
People are shouting at you. They get it wrong.

00:29:30
Like no, it speaks to something breaking, something

00:29:33
wrong. It was like this same day there

00:29:35
was some tweet about how it got a gold medal on the Math

00:29:37
Olympiad or whatever, which is like the hardest math test we've

00:29:40
invented for human beings. And then I'm like, it can't

00:29:42
count. Fucking pomegranate seeds.

00:29:43
It's like, it's like, it's like worse than a three-year old.

00:29:45
This was solved by Dustin Hoffman in Rain Man, like.

00:29:49
This yeah. So my point to come back to this

00:29:53
is that I think for visual audio tool use whatever, you know,

00:29:56
James's category, there's just a long way to go in a lot of these

00:29:59
areas, right? But I think if you're like, OK,

00:30:01
text in, text out, you know, we're already at this gold

00:30:04
medal, you know, US math Olympiad level.

00:30:06
I think that, you know, o5, o6, o7, you know, I know they renamed

00:30:11
all these fucking things. So whatever numbering scheme you

00:30:13
want to take. But the next two or three

00:30:15
generations of models, I can see us getting to a text in, text

00:30:20
out, you know, AGI level of thinking, right?

00:30:23
Where, for me, the leap forward we took between GPT-4o and o3

00:30:27
was the first time that I really believed, you know, I got AGI

00:30:32
pilled that like, oh yeah. At least as a chatbot, this

00:30:35
thing is going to be, you know, much, much smarter than the

00:30:38
average human and basically able to do anything and potentially

00:30:41
break ground in terms of scientific discoveries and other

00:30:43
things like that. Related theme, My last theme

00:30:45
related. And I just wanted to add it on

00:30:47
as a sort of postscript. We sort of forget again another

00:30:52
old news thing that the term vibe coding was coined this

00:30:56
year. The idea?

00:30:58
Yeah, what? Karpathy's tweet was like January

00:31:01
this year or something? January is not this year.

00:31:03
That's like a different. Year.

00:31:04
No, it was it. Was this year?

00:31:08
Isn't it crazy? Like, so I would just say as a

00:31:11
postscript, as an extension of the thinking model progress and

00:31:14
as a post, an extension of what we've seen with these text in

00:31:17
text out models. And I would call this the year

00:31:19
of vibe coding, right? I mean, this is the clearly the

00:31:22
memetic phrase of the year. There's a million startups that

00:31:25
are getting crazy valuations off of

00:31:26
it. It, it is. And we were having Guillermo at

00:31:28
Vercel and Amjad at Replit, both on stage, different panels.

00:31:32
We had Lovable at CV London, you know, exactly right.

00:31:35
I would say internally to our company that all of our

00:31:38
designers and product managers now believe they can code, you

00:31:41
know, websites and prototype things for internal tools like

00:31:45
we have. Clearly, you know.

00:31:47
Cerebral Valley's logo's 3D AI generated, or like

00:31:50
the design was human, but I think the animation was, yeah.

00:31:54
So I would just say vibe coding as a phrase that has entered our

00:31:56
lives, you know, representing the idea of telling the coding

00:32:00
bot what to do and then getting back results and sort of

00:32:03
riding the vibes, as Karpathy said when he coined it, was a

00:32:07
big, big deal and I think was a huge part of what we actually

00:32:11
saw happen in the world this year based on large models in

00:32:15
AI. So yeah, those are my 2.

00:32:16
I love how your strategy is, look at the dates and be like, some of

00:32:21
these things we take for granted, we're so greedy.

00:32:24
You know, it's like a lot has happened and it's still very

00:32:27
current. And so we're gonna be unpacking

00:32:29
those those themes. Well, well.

00:32:31
You spent all this time complaining about things that

00:32:32
were too early. I'm.

00:32:34
Saying no, no, no, no, I like it actually here.

00:32:36
I think it's it's a good point and you're differentiating

00:32:39
obviously agents and other stuff.

00:32:41
You know, we got I'm not I'm not criticizing.

00:32:43
This is just funny. Actually worked.

00:32:44
This year in my opinion, right? Yes, and date them to 2025.

00:32:48
Yeah, James. Yeah, I, I have a, a tangent out

00:32:53
of that theme for Max, I guess I'm calling it the the jagged

00:33:00
edge of AI. Essentially, if you're not

00:33:04
someone who's using these coding models or vibe coding all the

00:33:08
time or playing with the latest VO model like or Sora like,

00:33:14
you're just not really seeing the dramatic improvements.

00:33:19
So I think most people in the world are are, you know, maybe

00:33:23
using ChatGPT 5 now and getting a little of this thinking and

00:33:28
reasoning, you know, experience, but not much beyond that.

00:33:35
And then they are thinking, oh, this is going to be another over

00:33:38
promising technology. This, you know, is kind of a

00:33:45
Silicon Valley pipe dream AGI. And then what you hear from the

00:33:50
labs themselves is like, Nope, everything's going pretty well.

00:33:54
Like on track to AGI, you know, we're gonna have automated

00:33:58
science in a couple of years. But Dario made some

00:34:01
predictions that aren't really holding up right, like some of

00:34:03
these predictions. I think that.

00:34:04
What, didn't you say 90% of code was gonna be written six months

00:34:08
ago? I think that's not that far off

00:34:10
inside of anthropic like I think like.

00:34:12
This is like self driving. Just like definition question or

00:34:16
something or? No, no, self driving was

00:34:18
just like tomorrow, tomorrow, tomorrow.

00:34:19
We're almost there. I'm literally saying like I

00:34:21
think today you could make an argument based on what I've

00:34:24
heard from Anthropic that you know 90% of the code is written.

00:34:28
You're saying it's a human problem.

00:34:30
The humans just haven't really figured out how to use the good

00:34:32
technology. But it's like positive.

00:34:34
Yeah, yeah, exactly. Like and I you hear from open AI

00:34:38
researchers and engineers that they like vibe coded or you

00:34:43
know, used Codex, their own, you know, version of Claude Code

00:34:47
to build like a lot of the products that came out at

00:34:49
DevDay. So I actually think there's just

00:34:53
like more adoption happening within labs than people realize

00:34:58
and like the people who are AGI-pilled in the labs have

00:35:02
like a good reason to be and and we're not seeing all of that.

00:35:05
Yeah. I mean, I, I agree with that.

00:35:07
I agree with the jagged frontier in general.

00:35:09
I think that it's, it's already not evenly distributed how good

00:35:14
these models are at some things. And, and I think particularly I

00:35:17
think the sort of GPT-5 launch, to your point, was sort of this

00:35:21
anti PR moment where people are like, well, AI like it's not

00:35:24
really any better now. It's just sort of like it's just

00:35:26
kind of sucks. And it's like, well, try

00:35:28
clicking the little. It's insane.

00:35:30
Yeah, yeah, it takes some. You have to do the work, you

00:35:34
know, you have to find the value, right, with ChatGPT.

00:35:37
But it's insane, this picture that you're painting that it's

00:35:40
like, oh yeah, they just have access to ChatGPT and they

00:35:43
can't see the value. I'm like what?

00:35:45
Like they're. Do you agree with my my?

00:35:47
Sort of story, or version of that, that most people are,

00:35:51
yeah, kind of, yeah, to Max's point, disappointed by the the

00:35:55
ChatGPT 5 launch. Yeah, I think there are a lot of

00:35:59
sort of skeptics. It takes too much work.

00:36:01
Yeah, to get the value out of it.

00:36:04
And people need to be led to water.

00:36:06
That's what that's what I keep saying.

00:36:08
You know, it sort of needs to be handed to them.

00:36:09
Here's the thing it can do. I hear a lot from in even people

00:36:13
in Silicon Valley who say, oh, the labs are lying about AGI.

00:36:18
You know, if you look at what's opening eyes releasing it's, you

00:36:23
know, Sora and they're gonna just hack attention.

00:36:28
And now Sam is talking about erotica in ChatGPT.

00:36:34
So like, you know, it's all, it's all, yeah, it's all we're.

00:36:39
Talking about AGI less, right? The public conversation, it doesn't feel

00:36:43
like the labs are talking about AGI.

00:36:44
Don't agree with that. I think they, I think they still

00:36:46
talk about it a lot. Like I mean.

00:36:48
I think yeah, I think, I think they we.

00:36:50
Should gauge that I'm. Talking about it.

00:36:51
What? Live streaming?

00:36:52
Yeah, yeah. I mean one thing.

00:36:55
I mean Microsoft right? Like the they just announced

00:36:57
this? Yes, it's contractual.

00:37:00
Yeah. Microsoft had this deal with

00:37:02
Open AI and you know, some of the triggers are some of the

00:37:05
deal terms are conditional on AGI, but in some ways it goes

00:37:08
both ways because it's a deal. So Microsoft could be like AGI

00:37:11
is not close, that's good for us.

00:37:13
We have a better deal, you know, so maybe Open AI thinks it's

00:37:16
close. Now I guess Open AI can declare

00:37:19
AGI to get out of the contract and then there's like an

00:37:22
independent panel of reviewers that will decide if if AGI has

00:37:28
actually been achieved, which I thought was pretty interesting.

00:37:30
Here's the last theme that I'll be quick about.

00:37:32
Don't say EA, like don't say effective altruist.

00:37:37
I mean that not like we're bringing that up all the time,

00:37:40
but I think, you know, anthropic is obviously run away from that

00:37:43
term. You know, we had what we had

00:37:45
Holden, right, who is a former, kind of, yeah, spoke at one of

00:37:51
them. So it's come up.

00:37:52
He's married to Danielle at Anthropic.

00:37:54
She spoke like Anthropic doesn't want to talk about effective

00:37:57
altruism anymore. And then there's this sort of

00:38:00
Trump world sort of thing. You know, you saw David Sachs

00:38:02
fighting with Jack Clark over having any sort of safetyism.

00:38:07
And so it's beyond just don't say yeah, it's will people talk

00:38:10
about safety at all and how much in sort of a Trump world is even

00:38:16
just worrying about AI safety lib-coded and therefore nobody

00:38:19
wants to talk about it. Yeah, David Sachs went like

00:38:23
further and said the all of this was regulatory capture, which I

00:38:29
thought was just a very. There was like performative.

00:38:31
Jump like, like it's clearly not performative.

00:38:34
To your point. Like they've been talking about

00:38:36
this for a decade. Everything he does is

00:38:39
disingenuous. All his arguments are motivated

00:38:41
for some end, so he cannot comprehend someone having.

00:38:44
Yeah, yeah, argument just on principle.

00:38:48
All right, those are our themes. I'm giving myself the last word

00:38:50
on that one. Very excited for this next

00:38:53
segment. We're going to go into the

00:38:54
draft. We've been doing this for a

00:38:56
couple years now, so we already have our picks and we're going

00:38:59
to be sort of dropping and then picking and picking up some new

00:39:03
ones. And I'm very excited to say with

00:39:06
the upped production value on the Newcomer podcast, we have a

00:39:09
hype video to catch you up on what we've done to try and make

00:39:12
this comprehensible. So you can follow along.

00:39:15
This ultimate feat of nerddom, which is taking what people do

00:39:19
for cool things like sports and using it to pick not just AI

00:39:23
nerdy things, but companies. I feel like it's a double whammy

00:39:27
of loserdom in that we're rooting for companies to succeed

00:39:31
and we're interested in in tech. So anyway, here's that hype

00:39:34
video. In living rooms everywhere, die

00:39:37
hard fans chase glory through fantasy football.

00:39:40
In newsrooms, armchair strategists bet the house on who

00:39:44
wins the Oval Office. But in Silicon Valley, these

00:39:48
insiders draft generative AI startups.

00:39:52
This is the AI Fantasy Draft League. In 2023, GMs Eric

00:39:57
Newcomer, Max Child and James Wilsterman chose their five

00:40:00
startups. Open AI, Inflection, Character

00:40:03
AI, Glean and Mistral AI. Databricks, Pinecone, Cohere,

00:40:07
Modular, and Imbue. Hugging Face, Anthropic, AI21

00:40:11
Labs, Replit and Adept. After a year of celebration and

00:40:15
regret. Dominating right now.

00:40:16
Bad pick, bad pick. In like 6 months that has just

00:40:19
totally changed the. GMs went back to the table to

00:40:21
evaluate their teams and prepare to add 2 second round draft

00:40:25
picks with Eric in the lead in 2024, followed by James and Max

00:40:29
struggling to hold on to his franchise.

00:40:32
Death-defying public IPOs, market fluctuations and many rounds of

00:40:35
funding later, the boys are back in the hot seat once again.

00:40:39
All in the hopes of glory on November 1st, 2028 when the

00:40:42
Champions will be crowned by total market cap valuation.

00:40:45
Welcome to the 2025 AI Fantasy Draft League, presented in

00:40:49
partnership with Newcomer. Wow, production value is just

00:40:55
just skyrocketed. If if if that had come out

00:40:59
before TBPN existed, I would say we should start.

00:41:01
TBPN. Not like TBPN invented production, like YouTubers have

00:41:07
been doing this forever. Everybody's yeah.

00:41:09
But anyway, I'm we're hooked. Up ESPN before they did OK.

00:41:13
I know and Riley, my business guy, literally one of his, the

00:41:18
his main editorial insight the whole time he's been here, he's

00:41:21
been with me three years. He's been like, you need to be

00:41:23
more like sports. You need to be more like sports.

00:41:25
He's like and he's like be more correct about.

00:41:27
Yeah, I told you, I take this seriously.

00:41:32
So I feel like the sports thing is a little glib sometimes.

00:41:34
Anyway, I love I love that. That's great.

00:41:37
We should get into it. James.

00:41:38
You're the both a player, but also sort of the game master of

00:41:43
this. I don't know what the.

00:41:45
What do you call it when you run?

00:41:46
Commissioner. Isn't there a term, commissioner?

00:41:48
Commissioner. Yeah, OK.

00:41:49
Yeah. Yep, this is year three-year

00:41:55
three of the draft. We've been working on this for a

00:42:02
while now. We all have teams just to set

00:42:05
the the rules again. You know, we drafted a few teams

00:42:09
originally. Now there's like waiver pickups

00:42:11
and you can, you can add and drop teams.

00:42:14
We're drafting companies that have raised over $100 million

00:42:18
have a core use of generative AI or a core role in the generative

00:42:21
AI ecosystem. There are some excluded

00:42:24
companies if they're really focused on bio health, defense,

00:42:30
cloud computing or chips or silicon. We

00:42:33
blocked the whole chip infrastructure, which is

00:42:35
sort of funny in retrospect now that everything's flowing

00:42:38
there, but we excluded that.

00:42:41
Yeah, we could always bring it back if you guys are don't think

00:42:44
we have enough companies to expect.

00:42:46
Yeah, I know we're not. We're not focused on robotics

00:42:49
companies or China companies based in China.

00:42:54
And yeah, it's a snake draft this this season, this year.

00:42:59
I always like to say I had to. I insisted that we have an

00:43:04
auction for who goes first, and we auctioned and I paid $75

00:43:08
billion so that I could pick Open AI first because my view

00:43:11
from the beginning was that Open AI was going to swamp it all and

00:43:14
you needed the winner. It's a home run business.

00:43:16
So far that strategy is doing well though, so.

00:43:19
Far, so far, Open AI is worth more than James and my team

00:43:23
combined this year, which is a perfect expression of power law

00:43:27
dynamics. Now you did pay a $75 billion

00:43:29
handicap, so it's a little bit under, but it's pretty damn

00:43:32
close, which is kind of mind-boggling. But it wasn't like, oh, I

00:43:35
just happened to go first by some random chance I paid for

00:43:38
the right knowing that we would fight for OpenAI.

00:43:41
You, your team's worth close to 510 billion right now, including

00:43:47
that handicap of 75 billion Max, your team's worth 150 billion

00:43:53
and my team is worth close to 400 billion.

00:43:57
Just to catch you up on last season, we we, we dropped a few

00:44:01
companies and we had some exit. So that opened up new slots on

00:44:05
our teams. I drafted xAI with the first

00:44:08
pick last year and that's been a huge win.

00:44:11
Now rumored to be valued around 200 billion.

00:44:15
Great fall in your lap sort of opportunity, I mean your first.

00:44:18
Is this the moment where I get to say I basically got Coin

00:44:21
flipped out of being in second place because if I had drafted

00:44:24
xAI I would be in second place by a mile just like James, for

00:44:29
the record.

00:44:29
Fair, fair. Well, fortunately for you Max,

00:44:32
you get to draft first this year.

00:44:35
Yeah, I know. And this is the year where

00:44:36
there's no. Obvious wins.

00:44:39
Yeah, I know. This is oh wow, I get the first

00:44:41
pick of the year where everything's worth $10 billion

00:44:43
and nothing's worth 20. So I guess first to start, Max,

00:44:48
do you want to just quickly go, you know, run through your team

00:44:52
and then also let us know if you're dropping any of your

00:44:55
team? Sure.

00:44:57
My team today is Databricks, Cohere, Modular, Scale, Sierra,

00:45:03
Sakana and Hebbia. You know, it's not great.

00:45:07
Other than Databricks, I did get an exit with Scale at $29

00:45:12
billion. Are we calling that an exit?

00:45:15
We basically I think. I'm calling that we, we've,

00:45:17
we've gone through this a couple times where there's been these,

00:45:20
yeah, pseudo exits count. So I'm counting that.

00:45:25
Yeah, again, I'm in last place by a huge margin.

00:45:31
Not not, you know, my team out on the field has just not really

00:45:33
been performing. Sierra is a good company, but

00:45:36
and Cohere I think is gonna have a second, I think they're.

00:45:39
Yeah, yeah, I think we are. There's some value here.

00:45:41
I think ultimately, you know, as I said earlier, had I been, had

00:45:45
I won the coin flip and gotten to pick xAI, I would be

00:45:48
breathing down Eric's neck kind of like James.

00:45:51
And instead I'm I'm a joke, you know, looming far in the

00:45:54
background. So with that, I'm gonna do some

00:45:57
aggressive restructuring on my team this year and do some heavy

00:46:02
dropping and try to pick up a lot of value in a hurry.

00:46:05
So I'm out on Scale. Yeah, I'm also going to drop

00:46:09
Modular. I'm also gonna drop Sakana and

00:46:12
I'm also gonna drop Hebbia, so I'm gonna make 4 picks in

00:46:16
addition to my 2 picks that I get as part of the draft this

00:46:19
year. So you get 4 picks or?

00:46:21
Yeah, you get 4 + 2. So you get 6 picks?

00:46:24
Yes, that's a lot, yeah. Yes, I'm taking, I'm shooting my

00:46:27
shot here at restructuring. Look, sometimes you gotta have.

00:46:31
A rebuilding year? Yeah, I think.

00:46:32
You gotta you gotta go all in, which is what I'm doing this

00:46:35
year, so we're really going for it.

00:46:38
Eric, you're gonna pick second this year.

00:46:41
Yeah, so my team, first of all, I've dropped nobody.

00:46:45
All my exits are from actual deals.

00:46:48
So I think that's another point in my favor.

00:46:50
But OK, I currently have Open AI, Glean, Mistral, Perplexity,

00:46:57
Safe Superintelligence and Harvey. Codeium just exited.

00:47:03
So I'm, slash Windsurf, whatever you want to call them.

00:47:07
So James has that at 2.4 billion.

00:47:10
I've been scrutinized, those are small potatoes, and then I've

00:47:15
already exited Inflection and Character.

00:47:19
So we I replaced those last time.

00:47:21
So the I have an extra spot thanks to Windsurf.

00:47:25
But otherwise, yeah, I'm I'm holding on.

00:47:28
I I like I like mine. I mean, who knows what's

00:47:31
happening at Safe Superintelligence.

00:47:33
We're just betting on Ilya there. Perplexity, obviously lots of

00:47:37
zeitgeist. Glean, you know, I really believe in

00:47:39
the business. I just don't know, you know, I I

00:47:42
think that one has the lowest to fall.

00:47:44
It's just like, does it have the true insanity?

00:47:46
You know, it seems like this this thing is supposed to end

00:47:49
what we said 2028, November. It's like the bubble bubble

00:47:53
could still be going for all we know.

00:47:54
But Glean, I feel like bubble or not, I'm, I'm, I'm betting on

00:47:58
that one. Yeah.

00:48:01
So I I like this team in Harvey. You know, we're putting them on

00:48:04
stage at Cerebral Valley.
00:48:05
They're getting, they're getting just another boost, the little

00:48:07
boost that they need, you know, to really go to the point.

00:48:11
To your point, it'll be it'll be hilarious if in 2028 where the

00:48:14
bubble has popped and the stock market has collapsed and the

00:48:17
final scoring is really just about like whose businesses

00:48:20
collapsed the least in the. Yeah, bubbles look like human

00:48:24
made this fucking fortune, sold at the right time, you know,

00:48:26
like yeah, yeah, your scale deal could be.

00:48:29
Like OpenAI will be like -400 billion and like.

00:48:32
You know, James and I all have these like, little like cobbled

00:48:36
together. Stones like.

00:48:37
Companies can only go to 0. You can't be -400 billion.

00:48:41
I don't know what you're. I mean on your current valuation

00:48:43
like from 500 to 100. Oh, I see.

00:48:45
Yeah. I'll draw from you.

00:48:46
I thought, I thought like, I just meant I thought like they

00:48:48
were gonna be in debt, you know, to, to the Saudis or something.

00:48:52
New to that? No.

00:48:53
No, no, I just meant, I just meant 400 down from today, not

00:48:57
not -400 valuation. We'll see.

00:49:00
Yeah, yeah. OK, so 3 picks for Eric, 2 new

00:49:03
picks and 1 exit. My team is Hugging Face,

00:49:08
Anthropic, Replit, xAI, Runway, ElevenLabs and Poolside.

00:49:15
Getting some value from xAI obviously and Anthropic, the rest

00:49:22
sort of small potatoes, but poolside is kind of rumoured at

00:49:26
that $12 billion valuation recently, which I think is

00:49:29
interesting. ElevenLabs just had a tender at 6.6 billion and Runway

00:49:37
rumoured at around 5 billion. So some value here.

00:49:41
I'm definitely thinking about dropping Hugging Face this year.

00:49:45
That's I don't have any exits this year so I'm drop.

00:49:47
I'm going to drop Hugging Face. They just haven't raised.

00:49:50
Brutal. They just haven't raised, you

00:49:54
know? I never thought that pick made

00:49:56
sense. We talked.

00:49:57
We we should talk a little bit. When you we did kind of shit

00:49:59
talk that pick so. But so you guys were.

00:50:01
Ready. We also should talk to other

00:50:02
picks that I mean, I don't know what's going on there.

00:50:08
It's an interesting question cuz they were the hottest company in

00:50:11
the universe kind of when you picked them at the small scale,

00:50:14
like they were like really, really, really yeah, Zeitgeist

00:50:16
in their early stages and I don't know, they're sort of out

00:50:20
there. People put their models up on

00:50:21
Hugging Face, but maybe for some reason they have not gotten that

00:50:25
AI premium like everyone else. Yeah.

00:50:27
I mean, they did raise, I think. Where, where were they at

00:50:31
when I drafted them? I mean and they've.

00:50:33
Weren't they at? Four, yeah, four, 4 billion, so

00:50:37
4 1/2, yeah. So I don't know, I don't, it

00:50:39
just seems like maybe hard to get another round done at a

00:50:42
higher valuation there. So I think that I'm going to

00:50:46
have to drop and free up a spot there.

00:50:51
The other ones are a little trickier because they're all

00:50:56
doing pretty well, but are they the next

00:51:01
OpenAI? Poolside is a black box. Yeah, poolside's a black.

00:51:04
Poolside still feels in the bigger big good.

00:51:07
Replit's still hot, obviously Amjad's coming to the

00:51:10
conference. xAI is great.

00:51:13
Runway's pretty exciting in the video space. ElevenLabs is raising

00:51:19
at seemingly every year at increasingly large valuations,

00:51:25
so that's a good one. I think I'm just going to leave

00:51:27
it at that and take 3 new 3 new picks.

00:51:31
Great. All right, let's do this thing.

00:51:32
All right, time to pick. Time to pick Max with the first

00:51:37
pick of year 3. I'm going to go with my gut here

00:51:41
even though this technically isn't the highest valuation on

00:51:43
the board and just go with the strongest vibes I think

00:51:48
available on the board and say Cursor, the coding application.

00:51:54
You're locked into that. You're locked into that.

00:51:56
I'm. Locked into Cursor. You wanted.

00:51:57
That I think that's insane. I'm I'm overjoyed and I'm like,

00:52:01
that was not one of the ones I was going to pick.

00:52:03
I just think that. I just think that my other

00:52:05
options are Cursor competitors, essentially Scale AI

00:52:09
competitors, questionable foundational model companies.

00:52:14
You know, I don't know, I just the heat on Cursor is so hot

00:52:18
right now so I'm happy to hear the negative case now that I

00:52:20
picked it since Eric clearly has a strong.

00:52:22
I just think it couldn't be hotter than it is.

00:52:24
Yeah, I mean, but that's what we said.

00:52:26
About but it's like 20 billion, right?

00:52:29
Why do we have? We have a listed here at, not.

00:52:31
Yeah, okay. I think it is 20, so maybe it is

00:52:33
number one right now. Yeah.

00:52:35
I mean, has there not been a lesson from this draft other

00:52:38
than a draft the highest valued company for.

00:52:41
The first pick, certainly, yeah, we're going for market caps,

00:52:44
not, not a sort of climb. So we're two for two

00:52:48
On the highest valued company being the best pick in

00:52:51
retrospect so far. So I don't know.

00:52:53
I can't. I can't, can't talk myself out

00:52:55
of taking. It, it is, it is rumoured at

00:52:57
around 30 billion. It's just not raised 30.

00:53:00
It's not raised at that level. So it's Max.

00:53:06
It's a good Max. Remember when the last two years

00:53:09
you've told us like that we were dumb for picking all the hypiest

00:53:12
companies and. No, I wanted xAI.

00:53:14
I wanted xAI, I wanted, I wanted xAI, clearly, definitely.

00:53:18
Pull the tape. I, I wanted xAI.

00:53:21
So don't. Don't back.

00:53:22
Don't. Retcon that.

00:53:24
If Tesla and xAI ever merge, we're going to have like a big

00:53:27
challenge in evaluating that that I also believe.

00:53:30
That I Yeah, yeah, I believe I bid up Eric farther with open AI

00:53:35
than you did, James. So I was more.

00:53:36
Of a believer that's. True.

00:53:38
All right, so I'm next. With the next pick, Eric.

00:53:44
This is a hard one. Not so easy now, is it?

00:53:47
My God. It's hard to be the man in the

00:53:50
chair now. I am picking Cognition Labs.

00:53:57
OK. All right.

00:53:58
So you, so you literally went for the spite pick. The spite Pepsi pick.

00:54:02
As a joke? They're, they're, they're like, you

00:54:05
bet... What? You went with Pepsi. I picked Coke

00:54:08
and you picked Pepsi and you were shitting on.

00:54:10
My... I'm betting on... this is fucking Red Bull.

00:54:13
This is the next generation. They're like doing the new

00:54:15
thing. You guys are stuck in the past

00:54:17
trying to sell sugar water, you know?

00:54:19
Like yeah, wait, how is Devin Red Bull?

00:54:25
Please explain. Just because it's like they've

00:54:28
got a new, they have a new growth strategy, they're doing

00:54:30
new things. I'm just saying it's like it's

00:54:32
not Pepsi. I mean, obviously, you know

00:54:34
what, Pepsi, I assume is bigger than Red Bull.

00:54:36
I don't actually know, but I'm just saying, you know,

00:54:38
Cognition's got some juice. They got new things going there.

00:54:41
They've, I feel like there's upside.

00:54:43
That's what I'm trying to say. They are the new, new coder on

00:54:46
the block. Even though they've been around,

00:54:47
it feels like, I feel like their momentum is is gonna be hitting

00:54:51
over the next 12 months. All right, sounds good.

00:54:55
Good pick. Good pick.

00:55:00
I have some tough choices to make here for my first pick.

00:55:06
You get 2 picks, I get 2 picks. Yeah, that actually helps a

00:55:08
little bit. We do a snake draft.

00:55:10
Yeah, yeah. I'm going to take for my first

00:55:13
pick Mercor, which is the data labeling startup.

00:55:18
That was my second choice. Isn't, isn't that also basically

00:55:21
a Scale AI competitor? It is, yeah.

00:55:23
I'm OK, All right. Which did exit at $30 billion,

00:55:27
right? Correct me if I'm wrong?

00:55:27
And apparently when Scale kind of got acqui-hired

00:55:31
by Meta, like a lot of business seems to have gone to Mercor

00:55:35
like because all the other labs were like, I don't want to send

00:55:37
all of our data to Meta. So they just raised, I think in

00:55:42
October, a $350 million round, is what my research says here.

00:55:48
And then valued at close to 10 billion.

00:55:51
So yeah, going with the the data labeling play, which is another

00:55:55
hot area of AI right now, definitely could come up as a

00:55:58
theme in the conference. I'm I'm happy either way with

00:56:01
whichever way you go. I think, I'm feeling, you're

00:56:04
making my life easy for my next pick. Really.

00:56:06
You're like but but. Because I don't have to make the

00:56:10
decision I'm saying. I think there are two obvious

00:56:13
ones to choose from. No.

00:56:14
And you're saying that I have to choose one or the other, OK.

00:56:18
Yeah, and I don't have to. I'll just get the other one.

00:56:21
OK, well let me take a look here for my second pick.

00:56:25
I'm like, nervous. I take this very seriously.

00:56:31
I'm gonna have to go with Thinking Machines Lab.

00:56:33
I just think it's another foundation model play and who

00:56:37
knows? Those are those are the only

00:56:39
ones that seem to matter in this draft in terms of potential

00:56:43
upside to ride with. And the insane acquirer

00:56:48
potential there. True, true.

00:56:50
And there seems to be a lot of true believers there because

00:56:52
like, it seems like Mark Zuck can't, can't poach anyone for

00:56:58
tens of billions. He got, he got one.

00:56:59
He got, oh, he eventually got that guy.

00:57:01
Got the co-founder of Thinking Machines. Oh yeah, yeah.

00:57:03
Yeah, that's right. Yeah, yeah, yeah.

00:57:04
No, he grabbed the co-founder of Thinking Machines. Fair enough.

00:57:08
The Yeah. And nobody knows what they're

00:57:11
doing, but yeah. They come out with interesting

00:57:14
research it seems like. Yeah.

00:57:17
But I agree it's a little bit of a hot pick. Just... Mira

00:57:21
Murati, the former CTO of OpenAI, founded this company.

00:57:25
Andreessen Horowitz is now a big backer.

00:57:27
But it's totally mysterious. And when this thing goes to 200

00:57:31
next year, you guys can't say that I got a steal cuz you had

00:57:34
the shot at this, you had the shot at this.

00:57:36
So I'm excited. I already have the Ilya one.

00:57:39
So on one hand I was like, oh, Safe Superintelligence, have

00:57:42
Thinking Machines, and have, I don't know, the pair of

00:57:45
speculative, super-hyped ones, you know, but.

00:57:49
It made it a little easier to pick right after Mercor, cuz I'm

00:57:52
like, all right, Mercor, we got real revenue and a real business

00:57:55
and something's happened in there.

00:57:58
But yeah, passing it back to Eric.

00:58:01
I'm back. It's back to me.

00:58:02
Yeah. All right.

00:58:03
This is a good one. I think you guys know what I'm

00:58:05
gonna pick, right? I mean.

00:58:07
I mean, I have a guess, yeah, but I don't want to say it. This, this

00:58:09
Founder can eat the fucking world.

00:58:11
I sit down with him and I'm like what business aren't you going

00:58:14
to destroy? Guillermo, CEO of Vercel. I'm picking Vercel.

00:58:19
They first of all, even if they don't succeed in vibe coding

00:58:23
themselves, they are like doing a lot of the infrastructure work

00:58:27
for a lot of the vibe coding. So it's like Lovable, Windsurf,

00:58:30
like, Bolt still use Vercel. So I think Vercel is like in a

00:58:34
cool position and I think Guillermo is a very talented CEO

00:58:39
and yeah, I think they're going places, so happy to

00:58:42
pick up Vercel, a sort of infrastructure play with,

00:58:46
you know, consumer aspects. Yeah, that definitely would have

00:58:49
been the next pick for me. Now it gets a lot more

00:58:53
interesting, I think, because I think there's a bit of a cliff

00:58:57
here. I get two. I'm lost now, yeah.

00:59:01
Yeah, I'm pretty lost, I mean. This is when it gets fun.

00:59:04
Yeah, I think thematically in the thinking machines category,

00:59:10
I feel like I, I just have to grab Reflection AI here just

00:59:13
because open source foundational model company, it's American

00:59:18
apparently that's part of their story and we're going to have

00:59:20
some sort of national security defense by having another cool

00:59:24
foundational model company. I I don't know, it's got big

00:59:27
time backers, you know, Sequoia and DST and all these folks.

00:59:30
So, yeah, ultimately, I think just, I honestly know very, very

00:59:35
little about this, but yeah, I'm like, it's worth $8 billion and

00:59:39
it's a foundational model company. Sign me up, Scotty.

00:59:50
Man, it's easy to be a VC. It's easy, especially when you

00:59:54
don't have to really pay. Like, we're not, like, buying, you

00:59:56
know, we're not caring about the cost.

00:59:58
We don't worry about dilution. We only care about market cap.

01:00:01
So if we were VCs, I think our incentives would be

01:00:05
different. I agree.

01:00:06
Like, OK, sure, I'm gonna get diluted like crazy.

01:00:08
Do I want it at this price versus like am I?

01:00:11
Just trying. So the the way we set this up

01:00:14
incentivizes foundation models and it would have also

01:00:17
incentivized infrastructure companies.

01:00:19
Well, this is like this is a classic debate in fantasy

01:00:22
football, right? Is like you do snake draft or

01:00:24
you do auction draft where you actually have to like pay out of

01:00:27
your own budget and stuff like that.

01:00:29
So next, next, next time we can be a little bit more, a little

01:00:33
bit more savvy. Yeah.

01:00:35
OK. This next one I'm I'm actually

01:00:38
going to take what I consider to be a real risk on here because I

01:00:42
think we're starting to get into the risk part of the draft here.

01:00:45
Cerebra Valley alumnus here. I'm going to take lovable.

01:00:49
I just feel like the momentum around Lovable as a brand name

01:00:54
or a meme or whatever the fuck you want to call it is just so

01:00:57
incredibly strong. And I don't necessarily think

01:00:59
the product is like that much better than a bunch of other

01:01:02
similar competitors or even ultimately Cursor, you know, but

01:01:06
I think that it's just got that sort of like mimetic force and

01:01:11
growth. And I do think the founder is,

01:01:14
you know, has a lot of kind of narrative power around the.

01:01:17
Company. You know, I lit them up on Twitter. They had the most

01:01:20
dark-pattern unsubscribe. No, I don't know. And they're getting all

01:01:24
this shit on Twitter about like whether their growth metrics

01:01:26
make any sense. I don't know. Some red flags. I was

01:01:29
Giving them some of that shit. So maybe it's, you know, maybe

01:01:31
it's a dumb bet, but I do also think that like, probably there

01:01:34
will be a consumer brand in like make a website with an LLM and

01:01:38
maybe it will ultimately just be OpenAI, at which point this is

01:01:41
all irrelevant. But they feel like they have by

01:01:43
far the strongest, like lead on being kind of the, you know,

01:01:47
consumer brand for vibe coding, for lack of a better term.

01:01:51
So yeah, I don't know, it's a bet.

01:01:53
I'm not in love with it, but I've got to take some shots.

01:01:57
And they feel like they have a very strong, you know, strong

01:01:59
memetic force behind them, I guess I would say.

01:02:01
And you're going to get a lot of picks that are scraps,

01:02:04
I think, because Eric and I are really picking.

01:02:06
All right. I'm up and I can't give away all

01:02:08
the companies that I think are good that are left.

01:02:10
We're definitely deeper. It's less like, oh, just buy the

01:02:13
highest valuation. Now I'm, I'm not willing to go for a...

01:02:17
There's, you know, companies that I like sub-$1 billion that

01:02:20
I'm like, Oh, you could bet on them, but I it seems risky.

01:02:24
OK, I'm embracing one of Max's themes, which is video is

01:02:28
doing better than voice. And I also do think, you know,

01:02:32
there, it's more fun to... this isn't like a fun business to

01:02:35
pick, but it's like selling to businesses is a good idea.

01:02:39
There's money to be made. That's sort of the Glean case.

01:02:43
And so, and I really like the CEO of this company, Victor.

01:02:48
So I'm gonna pick Synthesia. Synthesia.

01:02:50
Yeah, okay, yeah. And it's also got the, it's

01:02:52
European undervalued sort of thing going.

01:02:55
And, and so yeah, I, I think, I think they'll go far.

01:02:59
I'm picking Synthesia. Well, they just leaked that they

01:03:02
turned down an acquisition offer for $3 billion, right?

01:03:04
I missed that. I didn't even know that.

01:03:06
Oh yeah, yeah. So they're definitely gonna

01:03:07
raise at like 5 or 6 pretty shortly.

01:03:09
Smart pick, smart pick, good value in the lower

01:03:12
part of the draft if they're thinking about raising at, you

01:03:16
know, close to 5, right? James.

01:03:19
James final pick. I'll pick.

01:03:22
I'm gonna go out on a limb as well here deep, deep in the

01:03:27
waivers pool right now, but in talks I believe to raise in the

01:03:33
2 billion range, Suno. Yeah, that was also pretty high

01:03:39
up there. I don't think that's deep in the

01:03:40
waivers. Is it? Right, I mean right now if

01:03:42
you. They're they're, they're

01:03:43
obviously leaking all their ARR growth to close a round right

01:03:46
now. Yeah, I'm I'm just saying like

01:03:48
they their current valuation is sub a billion, you know, 500

01:03:52
million or something. Sure.

01:03:54
OK, so yeah, like yeah, there's been a lot of hype around them

01:03:58
in the last like week and they yeah, they leaked like 150

01:04:01
million, or I don't know if they did, but it leaked: 150

01:04:06
million ARR. A lot of discussion, obviously

01:04:08
a lot of potential IP risk, but every IP lawsuit these days

01:04:12
seems to be going the way of the models.

01:04:17
So we'll see if that stands. But I think no matter what

01:04:21
they're a really interesting acquisition target.

01:04:25
Cerebral Valley alumni. Yes, I interviewed Mikey

01:07:28
at the New York Cerebral Valley AI event last year.

01:04:32
You looked into his eyes. And I was like, yeah, I want

01:04:35
this guy on my team. He's a, he's going places.

01:04:40
He was a yeah. OK, that's my pick.

01:04:44
Great. Nice.

01:04:45
All right, all right, James and I are done. All

01:04:47
Right, you guys are done. It's time to try to grab some

01:04:49
value off the... So how many do you have?

01:04:51
Off the ground here. Well, I have 3 so far, but I get 3

01:04:54
more. This, yeah.

01:04:57
Yeah, yeah, this is. Exciting.

01:04:59
There was some. This is a fun thing for you to

01:05:00
do. There are some.

01:05:01
This is my rebuilding opportunity here.

01:05:03
I did like Suno and Synthesia. They both would have been would.

01:05:07
Have been in the mix. Oh my gosh.

01:05:08
Yeah, Yeah. I literally have them in my

01:05:10
notepad as ones I wanted. Here, for the first one,

01:05:15
I'm just going to grab one with a good valuation that we use a

01:05:17
lot at Volley because I think that I don't know, it's a solid

01:05:22
product and I have no idea if the business model is any good

01:05:25
or what the future holds here, but I'm going to I'm going to

01:05:29
take Fal AI. F-A-L, fal.ai. Well, I've been hearing about

01:05:32
that one a lot. Interesting.

01:05:34
But it's not. I don't really know much about

01:05:35
it. They're essentially like a kind

01:05:38
of a clearing house or single API provider that lets you call

01:05:44
on any of the photo and video models.

01:05:45
So it kind of plays into this photo and video theme,

01:05:47
which is like you're going to need to be able to switch

01:05:50
between these models really easily and grab the new ones.

01:05:53
And they take a little cut of that.

01:05:55
So it's almost like a marketplace for photo and video

01:05:57
models. And I don't know if that's

01:06:00
ultimately where value will accrue in the stack.

01:06:02
There's a lot of argument that their margins will be competed

01:06:04
to nothing or whatever. But like today, we at Volley as

01:06:08
a consumer company get a lot of

01:06:12
place to sort of call into for any kind of photo or image

01:06:15
generation. And so, yeah, I don't know,

01:06:19
feels like a... Invest in what you know, yeah.

01:06:23
Pretty highly valued at 4 billion as well.

01:06:25
Again, the opposite logic of normal investing, which is Oh

01:06:27
yeah, I want to take those high valuations.

01:06:30
Yeah, exactly right. I mean, there's this whole world

01:06:32
model startup theme that James and I are very familiar with,

01:06:36
which is this idea of like video, AI generated video that

01:06:39
you can walk around in and, you know, participate in as if it's

01:06:43
a video game. One of our good friends and

01:06:46
board member, Moritz Baier-Lentz, just raised $130 million for one

01:06:51
of these companies. But in this space, I'm going to

01:06:56
take, I think I'm gonna take World Labs, the Fei-Fei Li

01:07:00
startup. I think that it's just too much

01:07:04
name brand to avoid. I mean, I've seen Decart, I

01:07:11
haven't seen General Intuition's product.

01:07:13
I ultimately think Demis Hassabis and Genie will probably

01:07:15
win this whole thing or whatever, but maybe this is a

01:07:18
good acquisition or roll-up play at some

01:07:20
point. What did you think about

01:07:21
Decart? Why not pick Decart over

01:07:23
World Labs? Yeah, I literally am just doing

01:07:27
this based on like name brand. And my assumption is that she

01:07:31
can raise a shit load of money and that might ultimately lead

01:07:34
to them winning the compute and the data war here.

01:07:37
Yeah, but obviously that could be totally wrong.

01:07:40
OK, last pick. I think the last one I'm gonna

01:07:45
take, and this is, again, probably just biased on what I

01:07:49
know, is I'm gonna grab Sesame, which is this glasses startup

01:07:56
that's focused on realistic voice assistants built into

01:07:59
smart glasses, which sounds terrible, terrible in a lot of

01:08:03
ways. I

01:08:03
Would never pick that. They just raised $250 million

01:08:07
from Sequoia Capital, not nothing.

01:08:11
And I do think their TTS voice is quite good.

01:08:14
And I think that ultimately, it's interesting to see people

01:08:18
playing in the sort of voice and glasses space.

01:08:22
And they might get rolled up by either Mark Zuckerberg or Apple

01:08:26
or someone else like that at some point.

01:08:29
So it feels like just a, a real, a real Yolo play here.

01:08:35
And yeah, I don't know, it's just kind of fun and interesting

01:08:38
and I think the voice product is pretty good, so I feel like why

01:08:41
not invest in that? So all right, that's it.

01:08:44
Nice. Do you guys want to do it? Can

01:08:46
we review? Quick review.

01:08:48
Sure. Yes, I will kick it off my

01:08:51
roster. Previously: Databricks, Cohere,

01:08:54
Sierra and that was it. And I added a bunch of new

01:08:57
companies today: Cursor, star of the show,

01:09:00
Reflection AI, Lovable, Fal, World Labs, and Sesame.

01:09:06
And Eric? I had OpenAI, Glean, Mistral,

01:09:10
Perplexity, Safe Superintelligence, and Harvey, and I

01:09:14
picked up Cognition, Vercel, and Synthesia.

01:09:19
And I had Anthropic, Replit, xAI, Runway, ElevenLabs, Poolside.

01:09:25
I just added Mercor, Thinking Machines Lab, and Suno.

01:09:30
Good drafts. Yeah, fun.

01:09:31
We have one more episode before

01:09:34
Cerebral Valley on November 12th, we'll be making predictions or

01:09:38
reacting to some AI predictions. And yeah, if you're in San

01:09:43
Francisco and a startup founder or investor, reach out about

01:09:48
Cerebral Valley on November 12th.

01:09:50
Thank you for tuning into this week's episode of the podcast.

01:09:53
If you're new here, please like and subscribe.

01:09:55
It really helps the channel. We're building a YouTube

01:09:57
channel. I think you can tell we're

01:09:58
investing a lot more in our production and we appreciate

01:10:01
your support. And if you want the data, insider

01:10:05
takes, real reporting, go to newcomer.co and subscribe to the

01:10:10
Substack as well. Thanks for following along.