The Artificial Intelligence Startup Draft
Newcomer Pod · November 14, 2023 · 01:04:26 · 59.01 MB


If you could amass any five artificial intelligence startup bets right now, which companies would you pick?

My Cerebral Valley co-hosts and I took a stab at answering that question with an artificial intelligence startup draft.

Our startup draft starts at 27:35 after a discussion of some of the biggest themes going into this week’s Cerebral Valley AI Summit.

The draft gave us a chance to dissect some of the most promising startups in artificial intelligence right now.

The goal was to amass five companies with the biggest valuation five years from now. We restricted ourselves to AI startups that had raised more than $100 million.

I encourage you to make your own prediction in the comments.

Give it a listen



Get full access to Newcomer at www.newcomer.co/subscribe

00:00:10
Welcome to the Cerebral Valley Podcast on Newcomer. I'm back

00:00:14
with Max Child and James Wilsterman, my Cerebral Valley

00:00:17
co-hosts and the Volley co-founders.

00:00:20
We are very excited about this episode.

00:00:22
The meat of it will be a draft of the most valuable AI

00:00:29
companies we get. It is like, it's as close as I

00:00:32
think in Silicon Valley you can get to watching sports hopefully

00:00:35
or a fan of sports.

00:00:37
Sports debate TV show. Yeah, exactly.

00:00:42
Yeah, I I overcomplicated it to make sure I got the.

00:00:47
I won't spoil it anyway, you know we took it very seriously

00:00:51
and even if all our listeners forget about it, I am sure.

00:00:55
James, what was the time window? It's five, is it?

00:00:58
Five years, Yeah. In five years we'll all be

00:01:02
checking to see how we did. But before that, Cerebral Valley,

00:01:07
the conference is on Wednesday. And you know, obviously for the

00:01:13
people going in person, you, you get why you're curious.

00:01:16
It's like OK, But then for everybody else, the videos from

00:01:20
the conference will be posted on YouTube and in the

00:01:23
newsletter, newcomer.co. So this conference will be

00:01:27
brought to you digitally. We have not just hyped up an

00:01:29
event you will not get to consume, you will get to see the

00:01:33
talks. So look forward to that.

00:01:35
I think usually we put some of the best ones in the podcast

00:01:39
feed and we just wanted to sort of take stock going into the

00:01:44
conference sort of big themes that that are going to be on our

00:01:49
mind and then what's changed from the first Cerebral Valley

00:01:55
in March. So yeah, let's get to it.

00:01:58
All right. I said you guys could go

00:02:00
for themes last, I mean go for themes first and I would play

00:02:05
clean up so. James, why don't you go? Still

00:02:08
thinking about my themes? Yeah, I have noticed a theme

00:02:13
kind of gaining some heat over the last few weeks on Twitter.

00:02:18
slash X of kind of this big debate around open source

00:02:23
versus closed source in terms of safety,

00:02:28
right. I think this ties in with the

00:02:30
executive order put out by the Biden administration and kind of

00:02:36
a raging debate I guess or simmering debate of whether

00:02:40
these larger closed source language model companies are in

00:02:45
the business of regulatory capture through their efforts to

00:02:50
speak around model safety and

00:02:55
Get the Biden administration involved or whether that comes

00:02:59
from a true concern and how that impacts companies who are

00:03:05
putting out open source models for the community.

00:03:07
OK, I'm going to jump on that. Open source versus closed

00:03:10
source. I I think that's probably the

00:03:11
top in my mind. And in some ways I'd almost

00:03:14
frame it as like I think the first conference ended up having

00:03:18
more of the pro open source people.

00:03:22
You know we ended with Hugging Face and Replit.

00:03:25
Both of them are very pro open source.

00:03:29
I mean we have Databricks back; they're pretty pro open source.

00:03:33
Whereas this conference we're ending the day with Vinod Khosla

00:03:38
who has been somewhat critical of it; he wants more sort of

00:03:43
control and regulation. We'll we'll get the exact nuance

00:03:46
of his open source take, and then Mustafa Suleyman at Inflection

00:03:51
and the DeepMind co-founder, who's going to end the day, has been

00:03:54
super supportive of regulation and somewhat apprehensive about

00:03:58
open source. So I think this conference in

00:04:00
some ways will be the response to the really pro open source of

00:04:04
the first. Of course I I think open source

00:04:06
is very popular in the non-OpenAI world.

00:04:09
So I wouldn't be surprised still if probably the majority of our

00:04:12
panelists are still pro open source or yeah, what do you guys

00:04:15
think? I guess one thing I wanted to

00:04:18
ask you guys on this topic is: Last time at the conference, a

00:04:23
lot of discussion was about the letter that just came out, with

00:04:26
a lot of signatories around taking a pause on AI

00:04:31
development. Emad Mostaque had signed it.

00:04:34
Totally perplexingly, yeah. Yeah, which, which was?

00:04:38
Confusing to a lot of people. And I guess my question is, at

00:04:43
the time, it felt like there was a lot of heat on companies even

00:04:46
like OpenAI and Sam Altman for being cavalier with their

00:04:52
development of AI and causing, you know, an AI race and putting

00:04:55
out ChatGPT. And you know, before the bigger,

00:04:59
you know, more cautious companies like Google had

00:05:01
released their AIs, right? And I really think in like 6

00:05:05
months that has just totally changed where you know now that

00:05:08
you have this open source community putting out models.

00:05:11
Like OpenAI is able to like look like the responsible

00:05:14
stewards of AI going to the White House and you know,

00:05:18
getting regulations in place. Just curious if you guys agree

00:05:22
with that assessment? I do think a lot of the safety

00:05:26
talk has died down. I guess the executive order is

00:05:30
maybe a contrary example of that.

00:05:33
Although I you know, maybe the White House timeline is like six

00:05:35
months behind everyone else's timeline, but.

00:05:39
It does seem like people are a lot less hung up on like is the

00:05:44
AI going to kill everyone and sort of the tech industry and

00:05:47
then I guess simultaneously we got this massive executive order

00:05:50
banning or not banning but regulating models over a certain

00:05:53
size etcetera, etcetera. So it's it is interesting to

00:05:58
think about if there's going to be this sort of waves and and

00:06:02
troughs or peaks and troughs of the the hype cycle around safety

00:06:06
and I feel like we're in a little bit of a trough

00:06:08
inside the industry right now, but six months from now that

00:06:11
could change again. Just got to keep having

00:06:13
conferences in March. GPT-4 was like, you know, a

00:06:16
shockwave and it had come after 3.5 and I think it felt like,

00:06:21
man, if this keeps going like we're going to have generalized

00:06:24
AI tomorrow, and then, I mean, nobody has really caught up with them,

00:06:27
right? There are open source models and

00:06:30
people can say, oh, this is better than OpenAI on cost.

00:06:35
But I don't think anyone even in specialized use cases has really

00:06:39
had anything as smart. So then there's just a little

00:06:41
less fear of like, Oh my God, it's becoming like.

00:06:44
So you know, that is just not sort of the pace hasn't been so

00:06:49
out of control. And I think that's part of what

00:06:51
has cooled things down. But I sort of, I mean I think

00:06:54
the Biden executive order is like a huge deal and shows that

00:06:57
there's very active sort of government interference and like

00:07:00
Europe has been talking about things.

00:07:03
So I don't know, I think this question of cracking down on

00:07:07
open source is very real. But yeah, but the open source

00:07:10
community has certainly been vibrant and and you know, and

00:07:14
the other piece of that is that firms like Andreessen and others

00:07:17
have been very loud about sort of positioning themselves as

00:07:20
super pro open source. Yeah, interesting.

00:07:23
Next theme. All right, I have a theme for

00:07:25
you guys to think about here. I think one of.

00:07:30
The all time maybe the number one all time Silicon Valley

00:07:33
investing philosophy for all categories of software is you

00:07:38
know we like to invest in companies that sell picks and

00:07:41
shovels, not the companies that dig for gold, right.

00:07:43
We like to, we like to fund the tool makers, not the tool users

00:07:47
in many cases. And I think what's interesting

00:07:51
just in the last six months between the previous conference

00:07:55
and the current conference is. The the sort of what tools were

00:08:00
in vogue 6-7 months ago and then what tools are in vogue today.

00:08:04
And then the sort of cherry on top of that is Open AI itself,

00:08:09
you know, very, very recently announced this kind of huge wave

00:08:13
of first party tools improvements whether it's you

00:08:17
know, retrieval improvements, context increases, speed, cost.

00:08:22
Really letting anyone make their own GPT or character

00:08:26
to kind of go after the Character.AIs of the world.

00:08:28
Like they really came hard on like a bunch of tools categories

00:08:32
that I would say 6-7 months ago were being funded as like pretty

00:08:34
compelling stand alone companies.

00:08:36
And I think it's sort of a two-way question, which is one:

00:08:42
Is tools investment still a good idea in an AI world, like, or is

00:08:46
OpenAI going to eat this all alive? And then I guess two, like:

00:08:50
You know who who is, who is still exciting in the tool space

00:08:54
and and who do we think has like a strong foundation for the

00:08:56
future. Yeah, I mean I think that

00:08:59
question of will OpenAI just eat everything applies in every

00:09:04
part of the. Yeah, I mean that was a theme of

00:09:07
the first conference definitely, just like how how relevant is

00:09:11
everybody except OpenAI, and sort of an ongoing question.

00:09:16
So certainly, yeah, I think now with the developer conference,

00:09:20
this thread on tools and then I think you know, given ChatGPT is

00:09:26
clearly the most popular consumer AI app.

00:09:28
Well, I'd be interested. I mean, you have authentic

00:09:30
conversations with venture capitalists regularly, Eric,

00:09:33
right behind the scenes that don't all get published.

00:09:35
Like, do you ever get some authentic expression of concern

00:09:41
about tools companies that they've backed in the space that

00:09:43
were like, well, maybe we jumped on this tools idea too

00:09:47
early and it actually isn't a standalone product or you know

00:09:51
behind the scenes there are definitely a lot of firms that

00:09:54
are shitting on foundation model bets.

00:09:56
Oh, interesting. Oh yeah, that's that's like, I

00:10:00
mean they're expensive you know they're very speculative.

00:10:03
OpenAI is in the lead. They feel like even with OpenAI,

00:10:07
open source comes to parity very quickly.

00:10:09
So the, the category I hear the most dunking on is just like

00:10:13
foundation models and you'll you'll see firms that haven't

00:10:16
made a lot of bets there. You know, yeah, the Valley loves

00:10:20
like tools, picks and shovels. So that's not something where I

00:10:25
feel like I've heard a lot of regret.

00:10:26
And if anything I feel like I've been, I mean I've been talking

00:10:29
like vector databases all week and stuff like that.

00:10:32
And literally you're getting VCs talk about chips or getting

00:10:35
companies around chips, which traditionally they've been

00:10:38
scared about. But you have sort of the fact

00:10:41
that NVIDIA is doing so well and then you also have the fact that

00:10:44
it's sort of cool to be like an emerging tech investor at the

00:10:47
moment. So I think those things

00:10:48
combined, you're seeing that. So no, I think VCs are dunking on

00:10:54
foundation models, but bullish about everything else supporting

00:10:57
AI. Yeah.

00:11:00
Are there particular company types you have in mind like do

00:11:02
you think? Well, I mean, I think there's

00:11:04
like obvious companies that are like.

00:11:06
May be less hot than they were in the last conference, right?

00:11:08
I mean like Jasper. Jasper would be an example,

00:11:11
right? Stability would be a huge

00:11:13
example, right? But I think with a company like

00:11:14
Stability, that's just like the unique failings of.

00:11:17
That sure, sure. But there are companies that we

00:11:19
talked to last time, right? You know, LangChain, for

00:11:21
example, right. That like it's a big question as

00:11:24
to where LangChain fits in the universe, where OpenAI is just

00:11:27
going to take care of all this crap, right?

00:11:29
And then? You know, I think we're talking

00:11:31
to some companies this week that, you know, look like they

00:11:33
they could totally be awesome companies.

00:11:35
And I have no earthly clue, but they're providing very

00:11:38
specialized tools like you know, data cleaning or something like

00:11:41
that, right. Like which again I'm excited to

00:11:43
hear the pitch like of of why that's a a great idea for the

00:11:45
company and then probably is. But like, you know, I would be a

00:11:49
little scared that OpenAI was just going to like kick

00:11:51
ass in my category if I were a tools company.

00:11:55
And so I just think, and I mean, I think you have to agree that

00:11:57
Character.AI has to be pretty upset that OpenAI basically announced

00:12:01
their version of character AI tools, right?

00:12:03
Like, so I'm just interested if there's any real feedback of the

00:12:08
fear factor on that front. But it sounds like maybe not.

00:12:10
So yeah. Do you think just jumping diving

00:12:15
in on the character question like there is a theme I've

00:12:18
noticed of people kind of questioning whether?

00:12:22
OpenAI for consumers as an endpoint is really sufficient.

00:12:27
So there's kind of like the opposite side of that question

00:12:30
from the tooling side, right? Like will they eat all tools?

00:12:32
And then there's from the consumer assistant side, is

00:12:35
there a super assistant that's just ChatGPT or do you end up

00:12:38
using all these characters or mini GPTs that they are

00:12:43
working to enable developers to build?

00:12:46
How do you guys feel about that debate?

00:12:48
Yeah, I mean, I think that's, that's the open question.

00:12:51
I think it depends. It depends on the quality of

00:12:54
the, you know, the foundation models.

00:12:57
I don't know. I mean, I guess like you

00:12:58
selfishly we built games, James, that are built on AI tools,

00:13:01
including in some cases large language models.

00:13:04
So my selfish belief is it's possible to build games on LLMs

00:13:09
without OpenAI building the exact same thing.

00:13:12
It may depend on the complexity level of the consumer app built

00:13:16
on top of the tool, whereas Character.AI is sort of

00:13:19
pretty vanilla, you know, re-prompting or, you know, a tweaked

00:13:23
version of the core, the GPT experience.

00:13:26
But you know, I think it depends on the complexity level and sort

00:13:29
of how much of the stack you control and and how easy it is

00:13:32
to get to your customers. I mean a phrase I've been

00:13:35
hearing over and over again this week, you know, just talking to

00:13:38
VCs has been retrieval augmented generation.

00:13:42
RAG, RAG. And I think there are a lot of companies right now

00:13:45
that think we can be really good at giving GPT the sort of

00:13:51
information it needs and then you know OpenAI or whoever can

00:13:57
sort of do the thinking based on that.
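The retrieval-augmented generation pattern being described can be sketched in a few lines. This is a toy illustration, not any particular company's stack: the bag-of-words similarity scoring, the document set, and the prompt format are all assumptions standing in for a real embedding model, vector database, and LLM API call.

```python
# Minimal RAG sketch: retrieve the documents most relevant to a question,
# then hand them as context to the model that "does the thinking."
# The cosine similarity over word counts is a toy stand-in for embeddings.
import math
import re
from collections import Counter

def vectorize(text: str) -> Counter:
    # Toy "embedding": lowercase word counts, punctuation stripped.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = vectorize(query)
    return sorted(docs, key=lambda d: cosine(q, vectorize(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Retrieved passages become context; in a real system this prompt
    # would be sent to an LLM API rather than used directly.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Q3 revenue grew 40 percent year over year.",
    "The cafeteria menu changes every Tuesday.",
    "Revenue growth was driven by enterprise contracts.",
]
prompt = build_prompt("What drove revenue growth?", docs)
```

The division of labor matches what the speakers describe: the retrieval layer decides what the model should consider, while the foundation model does the reasoning over it.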

00:13:59
But we're not, we're not in the business of making the best

00:14:01
thinking company, but we're we're really good at figuring

00:14:04
out like what it needs to consider and and it's possible

00:14:08
that there are a lot of... What kind of companies do you think

00:14:11
are saying that around like I guess this is kind of.

00:14:15
Pointing to data and unique data sets as kind of a moat of some

00:14:19
kind, right? Like, what kinds of companies do

00:14:22
you think fit that category of potentially having a true data

00:14:26
moat that could be valuable, having a data moat?

00:14:31
Yeah, well, well, this isn't a data moat, but I mean, there's one

00:14:38
company, as you'll hear in a second.

00:14:40
Isn't it a data? Isn't it a data moat that you

00:14:43
are saying that? These companies are talking to

00:14:45
you saying we have... well, like OK, there's this company Glean,

00:14:49
right, that lets companies run searches, right?

00:14:52
Is it a data moat? They're really like good at

00:14:54
hooking companies up, companies' data up to data sources that

00:14:58
then connect, that use RAG and then plug into something like

00:15:03
OpenAI. So it's not necessarily like a

00:15:06
Bloomberg where it's like we have all the financial data in

00:15:08
the world. It's we're really good at like

00:15:10
getting the relevant data for

00:15:14
is one case. But at least at OpenAI Dev Day

00:15:18
this week, they seem to be going after some similar use cases of

00:15:23
allowing companies and enterprises to build their own

00:15:26
internal GPTs or ChatGPTs based on internal company data.

00:15:31
So I think that that is still some area that they seem to be

00:15:35
competing in as well. I mean, you know, eventually

00:15:39
there's only so much one company can do unless, unless it's like

00:15:43
the, I feel like James, this is your style of suggestion

00:15:46
throughout this series. Unless there's AGI and like

00:15:49
every way we talk about this is wrong.

00:15:51
But in any sort of traditional business sense where like you

00:15:55
know, Sam Altman, you know, has joked in the past that like, oh,

00:15:58
we'll just ask OpenAI, we'll ask ChatGPT at some point, you

00:16:02
know, like what our business model should be and we'll do

00:16:04
that. Clearly they didn't need to wait

00:16:06
for that. They figured out some business

00:16:08
models. But you know, I anyway, my my

00:16:11
point is just like in a traditional, normal world that I

00:16:14
expect to continue, You can't do everything And so whatever,

00:16:17
they're going to have to pick their strengths and then

00:16:20
everybody else can do other stuff.

00:16:24
I'm going to, I'm going to give a last, a last theme that's sort

00:16:28
of more in the fun sort of philosophy sort of world.

00:16:32
You know, fitting into that AGI question which is just how

00:16:36
seriously do people take existential risk in a, you know,

00:16:42
is existential risk now is seen as like this is propaganda for

00:16:47
people who want to make it, one, seem really awesome and sexy and

00:16:51
amazing and can do everything. And then, two, people who want to

00:16:54
be in this world of like shutting it down, Like how much

00:16:57
is existential risk really on people's minds anymore?

00:17:01
Or is it more important that we just make sure it's not racist

00:17:05
and make sure, you know, deal with these like, very practical

00:17:08
concerns that are obviously true now versus imagining this world

00:17:12
where it's. You know, you know, like our AI

00:17:15
Doom episode. You know, figuring out how to

00:17:17
optimize paper clips into blowing up the

00:17:19
Earth. That's kind of alluding to my

00:17:20
earlier point about I feel like Silicon Valley has stopped

00:17:23
taking safety as seriously, right.

00:17:24
I mean, like, I think that like, yeah, sure, the executive order

00:17:27
is not an example of that, right?

00:17:29
But in Silicon Valley, I think the discussion of, to your

00:17:33
point, the the implied seriousness with which people

00:17:37
take. Existential risk seems pretty

00:17:39
low to me because otherwise people would be like blowing up

00:17:44
GPU factories or whatever. I don't know.

00:17:47
It's like, if you actually believed this had a 20% chance

00:17:50
of killing all of society, which a lot of people say, like how

00:17:54
could you morally work in this space?

00:17:57
Or at least not like, you know, be actively opposed to any of

00:18:01
the development here, right? And and the only people who seem

00:18:04
to really act as if they truly believe existential risk

00:18:07
is real is basically, you know, Eliezer Yudkowsky and the sort of

00:18:10
the AI doomers on Twitter, right, who are really out there

00:18:12
every day saying there was, there was an interview.

00:18:15
Shout out to Logan Bartlett with his podcast.

00:18:19
He interviewed Dario Amodei, the co-founder

00:18:22
of Anthropic, right? And Dario gave like it was like

00:18:25
20% chance, like everything goes. There was like 10 to 20%

00:18:28
chance that was like really high I think.

00:18:31
Is that like in the next 100 years?

00:18:32
So And again the time frames matter right?

00:18:35
Like. I think it was.

00:18:36
It was reasonably soon. But yeah, I I don't know.

00:18:39
I sort of think you'd have to be insane to be working in an

00:18:43
industry where you truly believe that progress there had a 20%

00:18:47
chance of ending all humanity. I I don't know.

00:18:49
Like, unless you believe in some complicated.

00:18:52
Game theory about deterrence or something.

00:18:53
And back to our earlier episode about, you know, the only thing

00:18:56
that can stop a good guy with an AI, or sorry, a bad guy with an

00:18:59
AI is a good guy with an AI. I said it wrong.

00:19:01
Who coined that line? Did I get that line?

00:19:03
I don't. I can't remember but.

00:19:05
But the only thing that can stop a bad guy with an AI is a good

00:19:08
guy with a better AI, right? Yeah, unless you can explain

00:19:11
some pretty elaborate game theory on how progress in AI by

00:19:16
the good guys is necessary to stop extinction.

00:19:19
It just, you know, it doesn't compute that you believe this

00:19:21
stuff has 20% chance of killing us like you know, and be working

00:19:26
in the industry at the level of Dario, who literally is like the

00:19:29
number two most powerful person in foundation models, right?

00:19:33
I mean it's just, it's nonsense. So I don't know.

00:19:36
I agree it it doesn't seem like people take it that seriously

00:19:39
for real. As we discussed in that episode,

00:19:42
though, there's a lot of, like, very bad things that could

00:19:45
happen that, you know, AI kills some of us, right?

00:19:48
Not all of us, yeah.

00:19:49
That was the, that was the takeaway, all right.

00:19:52
You're right, right. Maybe it doesn't.

00:19:53
Like, I guess like maybe maybe people have come around to sort

00:19:58
of a middle ground here where it's not existential risk, like

00:20:02
extinction for humanity and more, that there's a lot of bad

00:20:05
things that could happen over the next 5 to 10 years with AI

00:20:08
and I think all of us. Would agree with that, right?

00:20:12
You know there are bad things that could occur if the, you

00:20:16
know, some bad actors got access to some really powerful sure

00:20:20
models or something, right? I mean.

00:20:22
I think you the fact that Cruise is not active in San Francisco,

00:20:27
like even if they sort of bungled their disclosures to the

00:20:30
government, I mean Cruise is a better driver than humans in San

00:20:35
Francisco, right Like. They really they they.

00:20:37
They broke the one rule, which is don't lie to the government,

00:20:41
right? I mean, like, it's just like,

00:20:43
you can be, you can be bad, you can screw up, you know?

00:20:47
But don't lie to the government is like freaking like #1 on the

00:20:51
10 commandments of running a company, right?

00:20:54
I mean, like, you know, that's why SBF's on trial or whatever, You

00:20:57
know, like, I mean, like like just don't lie to the

00:21:00
government, like, you know? Fake it so you make it can apply

00:21:03
in many scenarios, but not with the government.

00:21:05
There, you can't do that with them.

00:21:08
So anyway can we just go back to this for a second?

00:21:11
So you're you're what are you saying Eric about existential

00:21:17
risk or Max like you you guys are saying companies are not

00:21:20
acting like they believe their own hype or do you agree with

00:21:24
that or? I think, I think people are now

00:21:26
saying calling bullshit in sort of private.

00:21:30
Existential risk. And some of them don't want to

00:21:32
say it publicly because they don't want to be seen as sort of

00:21:35
undermining legitimate concerns. But I think a lot of them are

00:21:39
just like doesn't feel close and then how do you feel that ties

00:21:44
into the executive order? Does that feel to you like over

00:21:50
regulation or you know, pre? Premature.

00:21:53
I'll be sort of waiting until after the conference to stake

00:21:56
out my position partially, just like,

00:22:00
get more information. And also, you know, I sort of shift between

00:22:06
information gatherer and sometimes opinion haver.

00:22:09
And I don't know. I've been sort of waiting to

00:22:11
come out strong, though I do have a sort of

00:22:13
side. The last theme I'm actually interested in, and this is very

00:22:17
close to home for James and I in the last couple of months, is.

00:22:21
We are increasingly discovering that the cost of running any

00:22:24
large language model products at consumer scale is unbelievably

00:22:29
high. Like, oh, interesting.

00:22:31
Yeah, it is so high that it's very hard to come up with a

00:22:36
mental model for how you could build anything on top of a GPT

00:22:40
4-quality model right now that would be distributed to

00:22:43
consumers. You could maybe do it on 3.5.

00:22:47
Especially with some recent price cuts with, you know, fine

00:22:50
tuning, yadda yadda yadda. But I would say that it is

00:22:54
shockingly high to run a very basic application that's GPT 4

00:22:58
level quality on an LLM. And I think it basically makes

00:23:02
it impossible to build consumer apps on this level of model

00:23:06
today without incinerating money.

00:23:09
And so I think it's an interesting question as to

00:23:13
whether or not a lot of these companies that have raised 100,

00:23:15
two hundred million dollars. I have to imagine their unit

00:23:19
economics are pretty much upside down if they're building any

00:23:21
sort of consumer product. As in every time someone uses

00:23:25
their product they lose more money than they make.

00:23:27
So I wonder if the only outcome of this is that B2B companies

00:23:34
can maybe afford this because they can charge such such

00:23:36
staggering prices. And then?

00:23:39
OpenAI is like the only one that can do consumer stuff

00:23:41
because you pretty much have to have no middleman to afford

00:23:44
running these applications. OK.

00:23:46
So that's a great one and I'll broaden it and just say I think

00:23:49
there's an overall meta theme of that, which is the rubber

00:23:53
meets the road, right? There's this.

00:23:55
Do people really care about existential risk anymore?

00:23:57
Because it's like, oh, we're just in the practical problems.

00:24:00
Like it's really expensive to run consumer apps and I've heard

00:24:04
people in enterprise ask like are.

00:24:07
Are big companies, customers sticking around or do they try

00:24:11
the demo and say we didn't like it anymore?

00:24:14
And I mean, you know, ChatGPT, I think we're going to see in

00:24:17
a slide during the presentation, you know, activity sinks and

00:24:20
then they have new features and goes up like what's the

00:24:23
resiliency? So just this, yeah, the rubber

00:24:25
meets the road, all right. We've gotten all the hype, like how are

00:24:29
people delivering? And now it's sort of the moment

00:24:31
where it's turning into sort of more regular sort of business

00:24:35
style questions. I mean, this stuff is so

00:24:39
expensive to run that it's hard to imagine you would not have to

00:24:42
run into regular business style questions pretty quickly.

00:24:45
Now I was not in the Internet industry in 1993 when you had to

00:24:48
go buy your own servers to run a website, you know, so that this

00:24:54
may be analogous to that time, you know, where you had to be a

00:24:57
garage band server farm to afford to run these kind of

00:25:00
companies. But even that would have been

00:25:03
that would have been fixed cost, right, I mean now we're talking

00:25:06
about, like, we're talking a rising cost.

00:25:08
More customers you get, more expensive.

00:25:09
Yeah, Erasing software margins like.

00:25:13
So like, I'm pretty interested to see if anyone can run a

00:25:17
consumer product without just incinerating money besides Open

00:25:22
AI. So.

00:25:24
Stay tuned or or or use I would say.

00:25:26
I would say the last caveat would just be using smaller

00:25:28
models or. Yeah, using smaller models or

00:25:31
fine tuning or something. Or GPT-4 gets 20 times cheaper

00:25:35
in the next year and a half. And then this is a dumb,

00:25:39
irrelevant, you know, retrospective discussion, but it

00:25:41
really has to go down by like a 10X to be interesting.

00:25:44
So they did a 3x a week ago. We need two more of those to

00:25:49
really be in a good category of cost.
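The unit economics being described can be made concrete with back-of-the-envelope arithmetic. Every number below is a hypothetical assumption for illustration, not a quote of any real provider's pricing or any company's actual usage.

```python
# Back-of-the-envelope unit economics for a consumer LLM app, to make
# the "incinerating money" point concrete. All inputs are assumptions.
def monthly_inference_cost(
    price_in_per_1k: float,   # $ per 1K prompt tokens (assumed)
    price_out_per_1k: float,  # $ per 1K completion tokens (assumed)
    tokens_in: int,           # prompt tokens per request (assumed)
    tokens_out: int,          # completion tokens per request (assumed)
    requests_per_user: int,   # requests per user per month (assumed)
) -> float:
    per_request = (tokens_in / 1000) * price_in_per_1k + \
                  (tokens_out / 1000) * price_out_per_1k
    return per_request * requests_per_user

# Hypothetical GPT-4-class pricing against a $10/month subscription:
cost = monthly_inference_cost(0.03, 0.06, 1500, 500, 300)   # $22.50/user
margin = 10.0 - cost                                        # negative

# The same usage at 10x cheaper inference flips the margin positive:
cheaper = monthly_inference_cost(0.003, 0.006, 1500, 500, 300)  # $2.25/user
```

Under these assumed numbers the app loses money on every active user, which is the "upside down unit economics" point; a roughly 10x price drop is what moves the subscription into positive margin.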

00:25:53
Yeah, we, we should. I mean, that is a good reminder

00:25:55
that we should ask about money. Like, how much does this cost?

00:25:59
Like, how? Yeah.

00:26:01
How Where do you make money if like, you know, prices change

00:26:04
one way or the other. Yeah.

00:26:05
Yeah. All right.

00:26:07
Next up, I really think you're going to want to listen to this,

00:26:11
the draft pick. Please give us your feedback.

00:26:14
I want insane like memos about like what the order strategy

00:26:19
should have been, and just feel free to weigh in.

00:26:23
We did, you know, I I think we explained all the particulars in

00:26:26
in James, right. You talked through Yes, yeah,

00:26:28
we're about to jump into the rules and there were, there was

00:26:32
probably more time spent behind the scenes debating the rules

00:26:35
than on preparing for the draft. I was insistent about a

00:26:38
particular rule change that I mean you'll, you'll hear.

00:26:42
So we had a lot of fun. Send it to eric@newcomer.co.

00:26:48
You can always send me your opinions about who's an idiot I

00:26:52
hopefully. I think I'll try and post the

00:26:54
list. So please forgive us that it was

00:26:57
a set list. There are companies.

00:26:59
I asked if I could pick companies off-list as sort of an

00:27:03
insider with intel. But I was not allowed to

00:27:05
pick off list. So I you're restricted to a list

00:27:08
of companies, and if we didn't pick you, please hire a new marketing

00:27:12
officer. I don't know.

00:27:13
Sorry. Yeah, find out why your

00:27:17
your valuation was also not, you know, listed in PitchBook, right,

00:27:21
'cause the list was selected from companies who are or who

00:27:26
have a recent over-$100 million round in PitchBook.

00:27:32
So go listen to that.
00:27:37
Welcome. Welcome.

00:27:38
All right. We are gonna move into the draft.

00:27:41
This is probably the thing we're most excited about and

00:27:45
least informed to do, taking a page out of the

00:27:50
sports podcast and political podcast playbook, and we are

00:27:54
going to do a draft pick of the top AI companies.

00:28:00
James, do you want to explain the metrics for who wins this

00:28:05
and how we know and all of that? Here's what we're doing.

00:28:08
Max, Eric and I are drafting teams of AI startups.

00:28:13
The startups must have raised over $100 million.

00:28:18
I wish that wasn't the case, partially because I have intel

00:28:20
on like companies that are promising, but James isn't

00:28:24
letting me include random ones. I feel like there's, you know,

00:28:27
it's a risk. They are not even valued at that

00:28:30
yet, but I I, yeah, I think we have, we have 43 companies in

00:28:35
the list and we're only picking five each.

00:28:39
So many of these will not make it on the board.

00:28:42
So I'm excited to see what happens, but this really shows,

00:28:46
I think, the amount of companies who have raised ginormous rounds

00:28:51
in AI. Is this the 100-million-plus list?

00:28:54
You have to have raised over 100 million, not valuation,

00:28:57
raised. It's probably 100 million

00:28:59
because I do think also there are probably ones that

00:29:01
aren't. So some other criteria I used to

00:29:05
kind of narrow down this list. You had to be using generative

00:29:09
AI, or the company had to have generative AI or generative AI

00:29:14
infra kind of core to it. I kind of excluded some

00:29:17
industries like bio and healthcare, defense, silicon

00:29:22
chips, robotics, driverless or AVs, and China.

00:29:26
I just wanted to filter down to companies we might have a chance

00:29:30
of knowing something about or being predictive about, rather

00:29:34
than just predicting whether some biotech company is going to

00:29:39
use AI to discover a new drug or something.

00:29:42
And I also excluded

00:29:47
companies founded before 2013. So most of these companies have

00:29:51
been actually founded in the last three years or something,

00:29:54
but there are a few who have been founded maybe earlier.

00:29:58
So that's the game and we are going to determine who goes

00:30:02
first by creating a handicap auction.

00:30:06
So this was, I was adamant, yeah, that we do this.

00:30:09
James and Max were like, oh, you know, that's part of the

00:30:12
randomness. But to me, like a huge part of

00:30:14
this is, how much would you pay to get to pick

00:30:19
OpenAI? Because I honestly think we're all going to pick

00:30:22
OpenAI. We should talk about

00:30:24
what the, what the grading criteria is.

00:30:27
Yeah, here. So the way we are drafting these

00:30:31
teams, we are trying to create a team with the highest

00:30:35
total valuation on November 1st, 2028.

00:30:40
And final valuations would be determined by PitchBook, either

00:30:44
kind of a reported M&amp;A price or a current stock price if

00:30:48
the company is public, or the last valuation if the company

00:30:52
is still going but is private and if the company is shut down,

00:30:56
I think that would count as a 0 for that team.

00:31:00
If, like, does it have to have raised at that price

00:31:04
within the last two years or something because?

00:31:09
There's whatever the last fundraise was.

00:31:11
Yeah, just. But like, it might not be worth

00:31:12
that anymore. One question I have is let's

00:31:15
let's say there's a company on this list who has raised, who

00:31:19
has raised a huge round, maybe in the ZIRP era, but never raises

00:31:23
again, but is a zombie company in five years.

00:31:27
Are we counting that at their? I think the last valuation.

00:31:31
No, it needs to raise another round or do something in the

00:31:35
next five years because otherwise there's a strategy of

00:31:38
just like picking the most highly valued companies.

00:31:41
And just betting that like nothing happens.

00:31:43
That's the efficient market hypothesis.

00:31:44
I don't think. I think, in theory, current

00:31:47
valuation is the best predictor of future

00:31:50
value. It needs, it

00:31:52
needs another fundraise or reporting.

00:31:53
Yeah, I think so. Max, are you OK with that or.

00:31:57
I mean, I don't care. I don't care.

00:31:58
Yeah. Yeah, yeah.

00:31:59
So whatever. It needs something, whatever,

00:32:01
right? Are we agreed on that?

00:32:03
Yeah, we're agreed. So if the company does not raise

00:32:07
any round between now and November 1st, 2028, yeah,

00:32:12
according to PitchBook, a new round, they will count as a zero

00:32:16
as well. Fair.

00:32:18
OK. Yep, brutal.

00:32:21
So how are we doing the bid for 1st and 2nd place?

00:32:28
Well, I think the idea here is, you know, we're gonna

00:32:34
just go down the line here and name a price that you are

00:32:38
willing to subtract from your total score, and then in the end

00:32:42
determine, you know, the order from the order of

00:32:47
those 3 handicaps, and we will deduct those scores at the

00:32:51
end. All right, let's just do a live

00:32:52
auction. Like whoever gets the highest

00:32:54
price and then we stop it. OK, so then it's a coin flip.

00:32:57
OK, fine. Coin flip for a second?

00:32:59
Yes. OK, let me pull up my digital

00:33:02
coin flipper. OK, All right.

00:33:06
And then it's a snake draft. Yes.

00:33:08
All right. Let's let's start.

00:33:10
I bid negative 45 billion. For OpenAI.

00:33:15
For first for to go first. For first take, OK, I'll go

00:33:26
I'll, I'll go to 50. Sure, I'll go.

00:33:30
I'll go 55. I'll go 60. 60 billion, negative 60

00:33:37
billion. How is OpenAI going to have

00:33:40
an exit? I just am confused by this.

00:33:42
They don't need to have an exit, they just need to

00:33:45
have a fundraise. OK.

00:33:46
I'll go. I'll go to 70. 75 OK, that's

00:33:58
fine. Max, you're done.

00:33:59
I'm done. I'm out.

00:34:00
You got it at 75, all right. First pick, you have it, you lunatic: OpenAI.

00:34:05
OK, bad pick. Bad pick now I.

00:34:11
Imagine I did all that to get, like, Anthropic or?

00:34:13
I just, I just I just think the number of companies that are

00:34:16
worth like hundreds of billions of dollars on the secondary

00:34:19
market is just extremely low. So you're basically

00:34:21
discounting the net equity value to like 5 or 15 billion now or

00:34:24
whatever it is. So sorry, yeah, added.

00:34:28
Yeah. It's currently valued at 80 or

00:34:29
90 right on secondary markets and.

00:34:31
You took 70, 75 off that. So you're basically saying,

00:34:35
you know, 10 is my starting point and I have to get

00:34:39
another 100 billion or whatever on this valuation?

00:34:41
If it's, is it a trillion dollar company like?

00:34:45
Yeah. Then I crushed it like.

00:34:47
But are they going to be a trillion dollar company as a

00:34:49
subsidiary of Microsoft? I just don't believe that.

00:34:52
But that's fine. This is why it's an exciting,

00:34:54
contentious pick. Yeah, great.

00:34:56
OK, All right. Eric's going to go 1st, and Max

00:35:00
and I are flipping for 2nd place.

00:35:02
Max, name your head or tails here.

00:35:04
I'll take heads. Heads.

00:35:06
You're up second. OK, Eric, we know what you're

00:35:08
picking. Do you want to make it official?

00:35:10
OpenAI. Congratulations. Knew what he was doing.

00:35:15
Smart guy. Smart guy.

00:35:17
Anything you want to, Anything you want to add there, or.

00:35:20
Well, I do also think the fact that Microsoft is an investor

00:35:24
helps drive up the valuation because they're getting spend

00:35:27
back on their system. There's all this revenue round

00:35:30
tripping going on. I think it'll definitely raise

00:35:32
another round or two. So I feel like it's a super safe

00:35:35
bet. I think I'm getting it if we

00:35:37
know it's reported to be 90 billion.

00:35:39
I'm getting it like, yeah, with. With.

00:35:41
But if Microsoft buys, if Microsoft buys an additional 2%

00:35:46
to go from 49% ownership to 51% ownership, that's like an

00:35:49
exit in my opinion. Like that's an exit.

00:35:51
I agree. Yeah, yeah.

00:35:52
What do you mean no? They have majority ownership of

00:35:55
OpenAI. I mean, it's a subsidiary of

00:35:57
Microsoft at that point. No, it's just over then, like.

00:36:00
Yeah, I think. So no, these are if there are

00:36:03
deals on OpenAI that value it at more,

00:36:06
It's worth more, definitely. If other people, if people are

00:36:09
getting shares at a higher price.

00:36:12
There will be no shares of OpenAI, though, in this context.

00:36:14
I don't know, we can see what actually plays

00:36:16
out. If the valuation goes up and there are people who own a lot

00:36:19
of it, then yeah, it it has its own valuation.

00:36:23
There's no way that you're saying, well, I guess

00:36:27
the shares have to dissolve is what you're saying for it to be

00:36:29
an acquisition that's kind of. Yeah, yeah.

00:36:32
Once they're in Microsoft, I accept that.

00:36:35
Like I don't get to ride up the Microsoft benefit.

00:36:38
Even though I do think that's a

00:36:40
possible outcome, but. I I just don't think they ever

00:36:42
make it to whatever a trillion dollar valuation is a stand

00:36:46
alone. I don't know if Sam Altman would

00:36:48
sell away, you know, control to Microsoft.

00:36:51
I don't think so either. And like there's antitrust, and

00:36:53
like I think this is the most Microsoft is probably going to

00:36:56
own. So, but I I'm not willing to

00:36:58
accept like if they own 60%, I'm like no, that's fair, that's

00:37:01
fair. OK.

00:37:02
All right, all right. All right, Max.

00:37:04
I will take Databricks at #2.

00:37:11
Yeah, smart pick, smart. I like Databricks. I

00:37:15
like to think that I mean the goal here is obviously to really

00:37:20
get a a couple huge winners or one really huge winner, right.

00:37:24
And I think that Databricks is obviously already a very large

00:37:27
company, you know 10s of billions of dollars and I think

00:37:30
there's probably many tens more, or potentially hundreds, in

00:37:33
headroom there and so. Yeah, I don't know.

00:37:36
I think it's a good call option on a really massive company.

00:37:38
So I'm grabbing Databricks, and also my favorite speaker

00:37:41
from CV1, Ali Ghodsi, the CEO, and he's coming back for CV2.

00:37:46
So I'm just a little Homer pick here.

00:37:48
Smart. Yeah, this is giving me now.

00:37:50
My stomach's turning. I do like Databricks. I've been

00:37:52
super bullish about them for a long time.

00:37:54
So not to have them in my list, it hurts.

00:37:58
All right, James, pick 3. All right, I've got two that

00:38:01
I'm. You get two in a row, Snake

00:38:04
draft. You get two in a row.

00:38:05
I get two. OK, I just do both of them.

00:38:07
Yes. All right.

00:38:08
I am going to go with Hugging Face as my first pick.

00:38:13
Wow, that's that's high. That's high.

00:38:15
I think you could have gotten the value bet later in the later

00:38:18
in the draft there. No, I might have.

00:38:20
No, it sounded like Eric was eyeing Hugging Face

00:38:23
as well. Yeah, I think.

00:38:25
I just really think Hugging Face is going to exit at

00:38:27
under 20 to Microsoft, basically. So that's good for

00:38:30
me, though. That's like a pretty good lock-in of 20 billion for

00:38:33
you. You need to stack those

00:38:35
hundreds to win this draft, James.

00:38:37
We're going up against the trillion dollar open AI here.

00:38:40
So then you should only pick foundation model companies. You

00:38:43
should pick whatever you think can go to a trillion, Eric. You

00:38:45
know, it could be anything. Hugging Face has a lot of room,

00:38:48
I think, for potentially getting to that 100 billion level if

00:38:53
they kind of maintain their, you

00:38:58
know, essential position in the open-source community, and they

00:39:02
potentially even build tooling around this that helps other

00:39:06
companies use open source in their enterprises.

00:39:10
And yeah, I just think there's a pretty high ceiling there.

00:39:17
GitHub sold for 7.5 billion. Yeah, I'm just saying

00:39:20
GitHub is everything Hugging Face is and more, in my opinion, and

00:39:23
they decided to exit at the top at 7 1/2.

00:39:26
So I don't know. I mean, I love Hugging Face,

00:39:28
for the record, we're, you know, big fans, but I don't know, not

00:39:31
not sure about the pick on the upside.

00:39:33
And they are a forum at the end of the day, like.

00:39:36
Yeah. All right, James, second pick.

00:39:37
I mean, we love them.

00:39:38
But for my second pick, I am going with Anthropic.

00:39:44
Huge second runner in the foundation

00:39:47
model wars, I would argue. Just recently struck a pretty major

00:39:52
partnership with Amazon and has a lot of the same people who

00:40:00
were originally working at OpenAI.

00:40:02
So I think, yeah, and I've heard really, you know, positive

00:40:06
things from developers about Claude in general.

00:40:09
So yeah, I'm really excited about Anthropic.

00:40:12
OK, this is a tough pick. I'm going to skip

00:40:17
through the really high fundraisers here for my second

00:40:20
pick, go a little more deviant. I'm going to take Pinecone, the

00:40:25
database company. I think that what I'm hearing

00:40:30
from my little birdies, little birdies in the AI

00:40:34
communities, Pinecone is the database of choice for AI

00:40:40
developers of all stripes. And I think if you look at the

00:40:44
history of Silicon Valley, you see really strong exits from a

00:40:48
lot of different database companies.

00:40:50
So yeah, I mean, I like Pinecone. I don't know, I mean

00:40:54
this would be a really good pick if I could get you know

00:40:57
discounted for the fact that their valuation is way lower

00:40:59
than most of the companies on this list.

00:41:00
But regardless, I'll take the upside.

00:41:03
I, you know, as a reporter does, went to a VC for advice

00:41:07
before this. I got one because I asked at the

00:41:09
last minute, but he said Pinecone was not there on his list.

00:41:13
He only gave me his five because, I think, his #3: enterprises are

00:41:18
wasting tons of time building toy prototypes that will

00:41:21
ultimately be replaced. All I care about is just

00:41:26
the crazy upside case, and I think that's what I'm

00:41:29
going for here. So, all right.

00:41:30
I'm gonna do I get 2 now? I don't know.

00:41:32
Do you get 2 now? Do you get to?

00:41:33
Yeah, you get 2 now. You get to swing

00:41:34
back. I'm like, nervous.

00:41:36
I'm gonna take Inflection. Please explain.

00:41:46
You got to defend your picks here, Yeah?

00:41:52
With the idea one sort of top talent obviously like DeepMind

00:41:57
co-founder Mustafa Suleyman. I'm not picking him because he's

00:42:00
speaking, but you know, I do think super legit company.

00:42:06
I mean obviously you're betting a lot on the future.

00:42:09
I do think you know if it's go for big swings, huge foundation

00:42:14
model potential is key. Smart technology, I think.

00:42:21
I think also just like give it, I feel like there's a floor and

00:42:24
that it's also like a potential like acquisition to just like

00:42:27
make some big tech company capable, worst case.

00:42:33
So yeah, I'm going Inflection for #2, OK.

00:42:39
And this is a tough one. All right.

00:42:42
I'm taking Character.AI. I just feel, oh man.

00:42:45
I feel like that's a bad pick. Bad pick.

00:42:49
You would have taken it, right? Yeah.

00:42:51
Bad. You would have taken it.

00:42:53
Or is the pick bad? No, no, no.

00:42:54
Bad pick is just a way I'd joke about every pick being bad.

00:42:57
Sorry, would you have taken Character? Like, I never would

00:42:59
have, but it sounds like James would have so.

00:43:01
I think I would have taken it, yes.

00:43:04
I'm going on a foundation-model-forward strategy, and I think

00:43:09
Character, you know, has perhaps a clearer path to a

00:43:13
product than most, I think. I mean they have a very talented

00:43:17
CEO. I mean, Andreessen Horowitz is going big on them.

00:43:21
So they're going to get lots of follow-on funding

00:43:24
rounds. Yeah, we'll see.

00:43:26
I don't know. I you know something I I still

00:43:31
think like the, you know, right now the bot sort of relationship

00:43:35
bot thing is a little overhyped, but.

00:43:38
You know, there's time. What do you think about

00:43:40
character as it is today? What is your?

00:43:44
Do you agree with the take that it's all sex chat, or?

00:43:47
Not, I don't, I don't know, I'm not like an expert on the

00:43:49
product. But to me there there's sort of

00:43:52
you got a dual bet in that they're super hardcore building

00:43:55
their own foundation model, deep on technology, and they

00:43:59
actually have a product that's out there that people are

00:44:02
finding fun ways to use. And like, I mean, you know, Mark

00:44:05
Zuckerberg sort of started with Hot or Not, so, like, anything.

00:44:08
That they have the sort of instinct to say, like, let's get

00:44:11
users, that to me is like a good, good muscle.

00:44:15
And like, even if this first iteration isn't like the one you

00:44:18
want the New York Times to be writing about, I think I think

00:44:21
it's a good sign. I think it's a good sort of out

00:44:25
of the money bet. You guys would not have

00:44:26
picked Inflection, though?

00:44:29
No, I think Inflection's a reasonable pick. I like

00:44:31
your picks. I like your picks.

00:44:32
I don't. I honestly don't really know

00:44:34
what these foundation models are, which is why I'm going to pick a

00:44:36
foundation model next, baby. Yeah.

00:44:42
I mean, I literally think I could be choosing foundation

00:44:44
models. Past OpenAI and Anthropic,

00:44:48
a monkey throwing darts at the proverbial

00:44:50
board would do better than me here, but I'll just go with

00:44:53
Cohere. I'll just take the market

00:44:54
deciding it's worth a lot. It seems like it's the number 3

00:44:57
or #4 best foundation model. Who the F knows is my defense of

00:45:02
this pick, but I got to grab a foundation.

00:45:05
Everybody wants to be in that business, so I feel like they're

00:45:08
going to face a ton of competition.

00:45:10
They were early, sure. Yeah, I'm.

00:45:13
I'm not. I'm, I'm.

00:45:14
I'm freely willing to admit this with them.

00:45:15
I'm I'm freely willing to admit this is my lowest conviction

00:45:18
pick here. But I just feel like I got to

00:45:19
have a foundation model, and that to me is the best pick on

00:45:21
the board. So so be it.

00:45:23
Onward and upward. Maybe I'll get lucky.

00:45:25
OK, I'm. I'm ready with my 2 picks.

00:45:27
All right? Please fire away.

00:45:28
All right. First pick: go on with another foundation model, of

00:45:33
course. Got to have one.

00:45:35
I will say you can see why VCs are investing big in foundation

00:45:40
models, because if the goal is to go for the fences, you're

00:45:43
like, you know, foundation models are, you know, big swings.

00:45:48
And so, like nobody here is really so far trying to

00:45:51
accumulate. Like, I could name some sure things, but

00:45:57
then, you know, you want like a swing

00:45:59
anyway, James. I'm going with AI21 Labs, which

00:46:04
is an Israeli foundation model

00:46:09
Solid. Just announced a $155 million

00:46:13
raise in August at a $1.4 billion valuation, and

00:46:21
the participants included Google and

00:46:25
NVIDIA. They create foundation models

00:46:29
particularly targeted at writing, and they are known for

00:46:34
their cutting edge Jurassic models.

00:46:37
So yeah, I'm impressed with the founding team and the product.

00:46:41
Yeah. And they they've got some, I

00:46:42
think they have a partnership maybe with Amazon, like I've

00:46:44
seen them come up, but it feels like they're under the radar

00:46:48
a little bit. I think it's a little bit of an

00:46:49
under-the-radar pick, but a solid one. That

00:46:53
makes sense. Foundation models, I mean, I

00:46:55
would say that is, you know, if they are Israeli and they're

00:47:01
valued at 1.4, probably in the US they'd be valued at 14

00:47:04
billion. So I think, yeah, exactly like

00:47:09
you know, that seems like a good cost-of-living

00:47:13
adjustment right there. All right and with my next pick

00:47:17
I am taking Replit. Amjad.

00:47:21
Yeah, another CV, CV AI alumnus, if you will.

00:47:28
So Replit. You have Hugging Face and Replit.

00:47:31
They're on the same panel together.

00:47:32
You're just a big CV AI chapter here.

00:47:36
I love it. I like to meet the founders and

00:47:38
get to know them. In the end, it's

00:47:46
really, it's really just a bet on the founders.

00:47:48
It's really about the people. It's about the people.

00:47:49
It's really about the founders. What do you think about this

00:47:52
stuff? Yeah, Founder market fit, if you

00:47:54
will. All right, What?

00:47:55
What's your? I'm definitely a fan of his founder

00:47:57
Twitter. Are Replit and Hugging Face, like, in the, like,

00:48:01
trillion dollar case, they're like fighting with each

00:48:03
other, right? They're both, like, in the coding

00:48:05
world. I don't, I don't agree with that.

00:48:06
I think that with Replit, the trillion dollar case here is that

00:48:10
software engineering changes dramatically due to AI

00:48:15
and we enter this world of everyone's an engineer,

00:48:20
you're using a cloud based IDE, you can use AI really

00:48:25
aggressively as you're coding and you don't even need to

00:48:28
really learn how to code. And I think they're the ones who

00:48:30
are furthest along at kind of building that vision that I, I

00:48:34
do believe is possible with AI. That was not on my

00:48:37
draft board, all right. Yeah, I don't think I would have

00:48:41
picked it. Yeah, I'm going to go with

00:48:44
another homer pick. I will be taking Modular,

00:48:48
another founder. I will be interviewing on stage

00:48:51
at CVA I2 for the for the back story here.

00:48:54
For the back story here. Chris Lattner, who Co founded

00:48:57
Modular, literally invented the Swift programming language that

00:49:01
is on every iPhone, every Mac, every iPad in the world.

00:49:05
Probably one of the top five to 10 most used programming

00:49:08
languages ever invented. So he's obviously a beast.

00:49:12
He has shown tremendous facility in creating really low level

00:49:17
technical products that are used by literally hundreds of

00:49:20
millions of people, if not billions.

00:49:22
Before that he created a tool called LLVM, Low Level Virtual

00:49:24
Machine, which is also basically everywhere.

00:49:26
So I'm just saying, Chris Lattner, he gets Ws, and he is

00:49:31
trying to reinvent the programming language for AI and

00:49:34
replace Python with a new language called Mojo that

00:49:37
Modular has invented. And so, look, I frankly don't

00:49:40
know what the business model is here or where we're going with

00:49:43
this part, but you know, Chris Lattner puts points on the

00:49:46
scoreboard. So that's what I'm hoping he'll

00:49:48
do for my team here. So, Modular AI.

00:49:49
I am breathing like a huge sigh of relief because I took

00:49:54
Character instead of this company, thinking Character

00:49:56
was going to be a more competitive pick, and I think you

00:50:01
were right, it proved true, and this company did not get scooped up.

00:50:05
I am taking what I would have actually probably put as my

00:50:09
third pick, but strategically dropped to 4: Glean, which is sort

00:50:14
of doing corporate search. Sequoia-backed company;

00:50:17
Lightspeed, Ravi, the key guy at Lightspeed is on it, and Mamoon

00:50:23
Hamid at Kleiner Perkins is an investor in this company.

00:50:26
So I just feel like it's one of these, Like, I don't know.

00:50:29
What do they all know? So you were really like a

00:50:32
venture capital fundamentalist here.

00:50:35
Like, yeah, like, you know, Eric's like, have you ever heard

00:50:40
of that fallacy of appealing to authority?

00:50:42
Well, I go the other way and say it's the way.

00:50:45
Well. Like, you know, like you should

00:50:46
always appeal to authority. I mean, we don't have the

00:50:50
financials, what are you doing? You're looking for a signal

00:50:52
like. I mean what is better signal

00:50:54
than, like, Sequoia, Mamoon, and Ravi? Like I,

00:50:57
you know I don't know what it is.

00:50:59
I think they're good VCs and, like, I don't have metrics. So

00:51:02
what am I going on? And honestly, my VC guru that I went to, here's

00:51:06
his case for Glean: semantic search over business data is the

00:51:10
biggest use case. This opportunity overshadows all

00:51:14
the in-house builds people are using LangChain, etcetera, for.

00:51:17
So instead of cobbling it together yourself,

00:51:22
use Glean. Like, I buy that. Wasn't this the Tanium

00:51:26
pitch, which also like you know, is sort of, Oh yeah, laying

00:51:29
around? You know you should.

00:51:30
You need to look into the eyes of the Co founders, right?

00:51:33
Like, are you gonna actually help your investors make money

00:51:35
out of this company anyway? OK, well there you go.

00:51:40
OK. So now I have a last one, This

00:51:43
one I'm debating between, like, product that's solid,

00:51:47
protect my sort of minimum, versus like upside

00:51:51
potential. Man, this is like, I have empathy for VCs.

00:51:56
It's like, should I go crazy, like hope and dream, you know,

00:52:02
or should I go sort of like product that people actually use

00:52:07
today. I'd like to think you would do

00:52:09
more than 10 minutes of research, which is what we've

00:52:11
done on this. I don't know.

00:52:13
Do VCs do that? I don't know, though.

00:52:16
They get, they get the financials.

00:52:17
We don't get any. I guess some of these companies

00:52:19
haven't. Like, you already did what most VCs

00:52:23
do, which is ask their VC friends who they should invest

00:52:26
in, just go with that. It works.

00:52:31
Lots of people, a lot of money doing that.

00:52:35
All right. I'm going for the hope and

00:52:40
dream of the entire European continent.

00:52:43
Mistral AI. Oh, my gosh. I thought you were going to say

00:52:47
something else, but yeah. Yeah.

00:52:49
Volodymyr Zelensky. All right.

00:52:52
Mistral is, you know, super hypey.

00:52:55
I think Lightspeed is an investor. European, you know,

00:53:00
they're very early, super, super hypey, but you know.

00:53:07
Smart, smart founders. I'm really enjoying this draft

00:53:10
because I really would not have picked almost any of the

00:53:12
companies you guys picked. And I feel, interestingly,

00:53:14
that must apply to me too. So yeah, we can review each

00:53:17
other's teams at the end here. I know who these

00:53:19
six would be. So I'll tell you at the end. I'm

00:53:21
gonna, I am also going to go hope-and-dream here

00:53:25
a little bit more. So I obviously, you know,

00:53:30
read and researched all the founders of these companies and

00:53:33
I was very taken by this founder's vision of self-improving

00:53:37
foundation models. So the idea of recursive

00:53:42
improvement of a foundation model and so that was the pitch

00:53:46
of Imbue. Our friend Kanjun, who, on a hope and a dream and, I assume,

00:53:53
some really solid demos has raised $234 million.

00:53:57
So, but I think if you're going for true takeoff trillion

00:54:03
dollar companies as it were the first person to figure out how

00:54:07
to make AI recursively self-improving will have a good shot

00:54:10
at being a trillion dollar company.

00:54:12
So I'll put some chips on Kanjun and see

00:54:15
what happens. Yeah.

00:54:16
I think they're very bullish on like agents, right, I mean,

00:54:18
right, and self-improving agents as well, which I thought was

00:54:21
exciting. So I'll throw, I'll throw $5

00:54:25
into that $234 million pot and help them get there.

00:54:29
All right, James, Final pick, final pick.

00:54:32
All right. With my final pick, I am taking

00:54:35
Adept. Interesting. Very highly valued. Cerebral Valley,

00:54:42
CV alum, yeah. Obviously super highly valued and

00:54:46
highly funded. As far as I can tell, they've

00:54:50
raised, yeah, close to half a billion dollars, right?

00:54:53
Woo hoo, hoo hoo. Essentially they are

00:54:56
trying to create a new way to interact, have AIs

00:55:03
interact with computers. Essentially allow creating

00:55:06
agents that can control computers.

00:55:08
I don't even know. I put this as a copilot, I guess, but I

00:55:12
don't even know if this is like that. This

00:55:15
seems like a brand new type of concept.

00:55:19
You know, creating agents, AI agents that can interact with

00:55:22
computers on your behalf. It's a pretty science fictiony

00:55:26
idea, so, you know, could be a huge flop,

00:55:31
or it could change the way, you know, the world

00:55:33
works. Founders have left in droves.

00:55:35
I feel like lots of internal turmoil.

00:55:39
I feel like it was hyped earlier. Other founders,

00:55:43
you know, there was a founder breakup at the company,

00:55:47
which is certainly a negative sign, a little scary,

00:55:52
but you never know. From the ashes rises the

00:55:55
phoenix. I mean the.

00:55:58
The CEO is, you know, David Luan, a former I think VP at

00:56:02
OpenAI. Certainly was there sort of early

00:56:05
days, super smart guy. But yeah, can't.

00:56:09
Can't keep the original team together.

00:56:12
Interesting. I just think, I think working on AI

00:56:15
agents in some capacity that can like to your earlier point about

00:56:20
self-improving AI and, you know, reinforcement, or maybe, Max,

00:56:26
you were talking about that. I think there's something there

00:56:29
that could unlock a lot of value, so I'm excited that

00:56:32
they're working in that area. Sure, got it.

00:56:35
My zoom out: I'm most stressed that I should have just

00:56:41
hoped for a second and gotten Databricks for

00:56:44
cheap, because I do think, Max, you got a great deal on this

00:56:46
selection there, yeah? And Databricks is like a pretty

00:56:49
sure thing, almost more sure than OpenAI.

00:56:52
I feel like the upside. Our producer says he's picking

00:56:58
Descript because we're editing the podcast with that, and also

00:57:00
that was one I was seriously considering for five: working

00:57:04
product, but I didn't know how much, like, what

00:57:08
their level of technical sophistication was, so I

00:57:11
didn't pick it. What?

00:57:11
What are other ones left on the board?

00:57:14
Anything you guys are surprised by? Notable?

00:57:16
I mean, Jasper obviously is, like, everybody's afraid it's,

00:57:19
yeah, a wrapper. I feel like Snorkel has gotten a

00:57:22
lot of hype with like internal data cleaning.

00:57:25
There's companies. That's what I've heard.

00:57:26
Yeah. Yeah, I they're they're always

00:57:28
bouncing around when I'm hearing.

00:57:31
Writer raised a bunch of money at 500, so I'm

00:57:34
interested to see. That was sort of a good price.

00:57:38
Lots of money back. Lots of money.

00:57:40
Good price for investors. It's interesting, Humane.

00:57:43
Which makes that little pin that's supposed to replace your

00:57:45
iPhone that you wear on your jacket or whatever that you talk

00:57:48
to. Nobody put that on the board.

00:57:51
That seems like a tough, tough sell. Hard, but.

00:57:55
Hardware is hard. I just hardware is very hard.

00:57:57
Yeah. Yeah.

00:57:58
They just launched it at Paris Fashion Week, which is odd.

00:58:01
I don't. Know I mean scale you know was

00:58:03
making bank off of humans and self-driving

00:58:06
car problems, and now they're trying to make bank on human

00:58:10
reinforcement learning for foundation models.

00:58:13
But it's just hard to believe, you know, it's humans.

00:58:16
Like they just don't seem like that.

00:58:19
You want? We don't need.

00:58:20
We don't need them anymore. We.

00:58:21
Don't need. I know I want a big like don't

00:58:23
need this middle ground thing. I mean, Stability not being in

00:58:27
the discussion, you know, compared to six months ago

00:58:30
is quite interesting. Obviously there's been, well,

00:58:33
first of all, there's been a ton of competition and, you know,

00:58:36
Midjourney has gotten super good and all that stuff.

00:58:37
And then secondly, yeah, obviously there was a story

00:58:39
about the founder making a bunch of stuff up, which was

00:58:41
interesting. I think Midjourney was left

00:58:45
off the board here because they haven't taken VC funding by the

00:58:48
way, but they could be super valuable.

00:58:51
Yeah, I know, hard to, maybe they'll never get a valuation, they're in the cash

00:58:53
flow business, but Midjourney, man, I would definitely have

00:58:56
taken that before Mistral, I think. Interesting. I mean.

00:59:01
Yeah, I think Weights and Biases is a really good company.

00:59:04
But I'm just like, what sort of, is it really going to gobble the

00:59:07
world? Runway, love the founder.

00:59:10
Like he's so charming. But, you know, I feel like

00:59:14
maybe, you know, the narrative there is like, does Adobe wish

00:59:18
they could have bought Runway for like way less than it paid

00:59:21
for Figma? You know, when Runway is sort of

00:59:23
the relevant company now? Yeah, I mean, lots of great

00:59:27
companies. I wouldn't write off most of

00:59:30
them. I don't even know what Go

00:59:32
Student is. I didn't look at it.

00:59:33
That's the most highly funded one. Did anybody research GoStudent?

00:59:37
I researched it a bit. So my understanding is they're a

00:59:41
European company that kind of got started in this AI

00:59:46
tutoring space and raised a lot of money, and now is building LLMs

00:59:50
to do tutoring. But I was a little skeptical of,

00:59:53
like, whether that was a, you know, current product or not or

00:59:57
prototype. Yeah.

00:59:59
So that's why I kind of ruled them out.

01:00:00
My last point would just be, I was really struck in doing the

01:00:02
research how unbelievably difficult it is to even tell

01:00:05
what the products of a lot of these companies are or how

01:00:08
they're differentiated. And I mean, you could really

01:00:10
just like throw buzzwords like large model learning like

01:00:15
reinforcement something something internal data,

01:00:18
external data, whatever, and just spit out a new Tech

01:00:21
Crunch article with, like, a fake company name in it every day, and

01:00:24
I would find them very similar to a lot of the research I did

01:00:27
for this. So it's.

01:00:29
I mean poolside, you know, like I've written about, it's for

01:00:31
code. I mean partially it's an

01:00:33
earlier, it's like earlier on the foundation model journey.

01:00:36
But yeah, there are a lot where it's like, man, it is.

01:00:38
It is remarkable how many similar-ish, like, either data or

01:00:43
foundation model or something about enterprise plus LLMs

01:00:47
companies there are that are in this category.

01:00:49
It's amazing. To me, my zoom out on

01:00:53
my view is that, like Max, I mean a lot will fall on Databricks

01:00:59
of course. And that Anthropic could be the one;

01:01:02
people plausibly think Anthropic could beat OpenAI.

01:01:05
In which case, like, James got a great pick, a discounted

01:01:10
Anthropic. Anthropic is the only one I like

01:01:12
from James. Yeah, really.

01:01:15
Anthropic is the one I want.

01:01:17
I like your, I like your Israeli one.

01:01:20
AI21, also, that's kind of interesting.

01:01:22
Yeah. What?

01:01:22
Which ones do you like from Eric?

01:01:24
I guess the, I mean OpenAI obviously is untouchable, but I

01:01:27
think Inflection is also quite strong.

01:01:31
The rest, I'm not so sure. Yeah, Character.

01:01:33
Man. And whereas your team's

01:01:36
just rock solid, my team is immaculate.

01:01:39
No, I would say other than Databricks, I have no sure

01:01:41
things, but I really took call options on hopefully giant

01:01:45
upside. Like, I could see

01:01:47
my team having three zeros on it.

01:01:49
But as long as the two are big, I'm going for the win

01:01:53
that way, yeah. We really had to do a

01:01:57
lightning, a speed run of learning VC portfolio strategy

01:02:02
here. We do not have.

01:02:04
The data, you know. And we don't get to talk to anybody.

01:02:07
Which is severe. This is not real VC.

01:02:09
We were playing basically with blindfolds on, but it was very

01:02:12
enjoyable conversation piece. It's kind of funny that we had

01:02:15
to like pick companies that could return the fund.

01:02:17
You know, they all have to. Like, yeah, home runs, right.

01:02:21
Yeah. All right.

01:02:22
Very excited to see you in five years.

01:02:24
Hopefully we'll follow up and like some of us will look like

01:02:27
maybe all of us will look like idiots, this space will be

01:02:30
wildly overhyped, and maybe my, you know, negative $75 billion will, like,

01:02:34
swamp the total value and then we can just.

01:02:38
I don't think that's likely, but as a closer or do you just want

01:02:41
to read off everyone's team? Oh yeah. For people not

01:02:44
watching, the teams are: Team Eric, with a negative $75

01:02:49
billion handicap that I paid to go first so I could scoop up

01:02:53
OpenAI #1, Inflection #2, Character AI #3,

01:02:59
Glean #4, and Mistral AI #5. Max, you want to read yours?

01:03:05
Yeah. My team is Databricks,

01:03:07
Pinecone, Cohere, Modular, and Imbue. Also known as the best team.

01:03:13
And James. I went with Hugging Face,

01:03:16
Anthropic, AI21 Labs, Replit, and Adept.

01:03:22
That's our episode, and that's the Cerebral Valley AI series.

01:03:28
Watch for YouTube videos. I'm sure I'll have Max and James

01:03:31
back, once I'm, like, not fatigued of talking, to sort of reflect

01:03:35
on it, but I'm not promising it immediately.

01:03:38
But this has been a great run. Thank you for sticking with

01:03:42
us. And if you missed some of the

01:03:44
earlier episodes, go back and listen to them.

01:03:48
Thank you so much to Scott Brody, who's been the producer

01:03:51
through all this and helped us figure out the show.

01:03:54
So shout out to him. Thanks to Riley Kinsella, Gabby

01:03:57
Caliendo, both key in sort of getting the conference and

01:04:02
everything together. Thank you to everyone who's

01:04:05
going to speak. And yeah, stick around.

01:04:10
Please, like, comment, subscribe on YouTube, subscribe to

01:04:14
newcomer.co and go play song quiz on your Alexa.

01:04:19
All right? Yeah.

01:04:20
Thanks so much. Thanks guys.

01:04:23
Thank you. All right.

01:04:24
Bye.