We’re officially one week out from the Cerebral Valley AI Summit! On today’s episode, co-hosts James Wilsterman and Max Child of Volley join host Eric Newcomer to preview what’s ahead — from standout speakers to can’t-miss panels and the big ideas that will shape the conversations next week.
To kick things off, Eric poses a timely question: What themes are starting to take shape across the participants and topics at this year’s summit? What’s really driving the energy in AI right now?
Here are a few of the themes that emerged from the discussion:
- Designers are moving closer to engineering, not just prototyping but launching internal tools and shaping product development in deeper ways.
- The war for context is just beginning — expect fierce competition over who owns the layers that make AI actually useful.
- Vibe coding: is it a real paradigm shift, or just a fun phase? And how big could it get?
- Is there a Microsoft Office Suite for the AI era?
- Distribution vs. product: what do the strategies of Uber and Waymo reveal about the future of AI deployment? Who wins self-driving cars?
- Text box solution vs. product: should everyone be copying ChatGPT, or is that a mistake?
- The growing appetite for data is fueling a new surveillance state.
- And finally, Eric's "Sprinting Toward the End of History" — companies face competition from every direction, but is there an end point?
It’s all building toward what promises to be a packed, thought-provoking week in Cerebral Valley. Let’s dive in!
Timestamps:
1:40: Eric poses the question
1:55: Evolution of the role of the designer
8:00: Text box vs product
14:15: Unbundling ChatGPT
19:19: Is there a Microsoft Office Suite for the AI era?
22:10: Who owns the context
26:02: Surveillance state
36:57: Distribution vs product
42:00: Sprinting until the end of history
00:00:00
This episode is brought to you by Forethought.
00:00:03
Most companies build their customer experience in pieces.
00:00:06
Sales in one system, support in another, and onboarding
00:00:09
somewhere else. Forethought brings it all
00:00:11
together. Forethought is an AI system
00:00:14
made up of advanced agents that handle sales, onboarding,
00:00:17
support, and retention. Each team manages its own
00:00:21
agents. The customer sees one unified
00:00:23
experience. Forethought powers over a
00:00:26
billion interactions every month for brands like Scale AI,
00:00:29
Cohere, Airtable and Upwork. Learn more at Forethought.AI.
00:00:34
Welcome to the Cerebral Valley
00:00:44
podcast. I'm Eric Newcomer.
00:00:46
I'm here with Max. Hey, Eric.
00:00:48
And James? Happy to be here, looking
00:00:51
forward to the conference next week in London.
00:00:54
Next week. For our returning listeners, you know I'm Eric
00:00:57
Newcomer, the author of the Newcomer newsletter and I have
00:01:00
Max Child and James Wilsterman, the co-founders of Volley and my
00:01:04
co-hosts of the Cerebral Valley AI Summit.
00:01:07
Some of the highlights will be in this podcast feed.
00:01:09
They'll all be on our YouTube channel.
00:01:11
We're going to hear from Dara Khosrowshahi, the CEO of Uber,
00:01:14
Dylan Field, the CEO of Figma, Alex Kendall, the CEO of the self
00:01:19
driving car company Wayve, Victor, the CEO of Synthesia, the CEOs
00:01:24
of Lovable, Granola, Orbee, Crusoe, Photoroom.
00:01:28
I'm not going to list them all; lots of cool CEOs next week.
00:01:32
And so Max, James and I assigned ourselves to come up with three
00:01:36
themes each that hopefully are not just agents are
00:01:39
important and sort of talk through what we think some of
00:01:44
the big themes are going to be.
00:01:46
Sort of think about what we want to hear from our
00:01:49
panelists. Who wants to go first with one
00:01:52
of their themes? Yeah.
00:01:53
So I think a big theme, especially since we have Dylan
00:01:57
from Figma speaking, will be this evolution of the role of
00:02:02
the designer. And maybe this will apply to
00:02:05
other roles as well. But I think we're already seeing
00:02:08
it a bit at Volley: designers being able to make much more
00:02:12
interactive, fully executed programmable prototypes in a way
00:02:17
that like really multiplies their impact, their speed to
00:02:21
iteration. Because they don't have to hand
00:02:23
it off to a coder, yeah. Exactly.
00:02:25
They're, we're cutting out that step in that early phase of
00:02:28
design or even for like things like internal tools, which don't
00:02:32
have to be like fully perfect when they launch, right.
00:02:35
We're allowing designers to kind of contribute directly to the
00:02:38
engineering in a way that was never possible before.
00:02:40
And I think that's just getting started and I think
00:02:42
Figma's leaning in there as well.
00:02:44
There was that old discussion back in the early days of
00:02:47
streaming movies and TV where they were talking about at
00:02:50
Netflix, Like they were saying, you know, our goal is to become
00:02:53
HBO before HBO can become Netflix, basically.
00:02:57
And in the end, they kind of both won.
00:02:59
But I guess you could probably say Netflix won a lot bigger
00:03:01
based on the stock prices these days.
00:03:03
But I think similarly, we're seeing internally and a lot of
00:03:06
other folks we talked to that there's this question of like do
00:03:09
designers become engineers faster than engineers become
00:03:12
designers? Oh, I love it.
00:03:14
That's great.
00:03:14
You could include product people as sort of part of the design
00:03:17
cadre of like, you know, do designers and product managers
00:03:20
become able to create software products faster than people who
00:03:24
can already create great software products are empowered
00:03:27
and enabled to design and ship and sort of finish complete, you
00:03:31
know, products in a way that they weren't in the past.
00:03:33
And I think that we've seen just amazing gains in the
00:03:35
productivity of our engineering team, right?
00:03:37
I mean, I think some people are like 3 to 5X as effective as
00:03:40
they were a year ago. But I think you're really seeing
00:03:43
like 0 to 1 gains from the designers and the product folks
00:03:46
who are now making websites and making mock ups and making
00:03:51
internal tools and pages and stuff that literally they could
00:03:55
not have done any of 6 to 12 months ago.
00:03:58
So there's a true kind of, yeah, step change happening.
00:04:01
And I think obviously Figma and a lot of other folks are kind
00:04:04
of, sort of. Lovable, from the other side.
00:04:07
Of course we're gonna have both of them.
00:04:09
Yeah. Do you want to explain Lovable
00:04:11
for our audience? I feel like some
00:04:12
people. Yeah.
00:04:13
Familiar. For people like me, it's for no
00:04:15
code. People who have no sense of
00:04:17
code. I mean, literally, I think you
00:04:18
have to imagine you're taking a screenshot of a website you
00:04:21
like, sending that to Lovable and saying like, build this with
00:04:26
XYZ. You go back and forth with it
00:04:27
and it's just like it's building with you.
00:04:30
Is the preferred flow that you take a screenshot of a website?
00:04:33
By the way, I don't think they're selling
00:04:34
that, but I think people understate that like so much of
00:04:36
what's great about AI in every domain is that it's really good
00:04:39
at ripping things off. And so Lovable works really well
00:04:42
when you're like, I like, I like the New York Times, I like the
00:04:46
Substack design. Why, why don't you borrow from
00:04:49
that? Here's my color palette.
00:04:50
You know, I mean, you can give it, you know, your logos
00:04:54
and it'll put them in and it'll take your color palette.
00:04:56
But it's really sometimes good at being derivative.
00:05:00
Yeah. Well, you got to speak to your
00:05:01
personal experience, right? Like you, you tried to build a
00:05:04
full website on Lovable, right? And you've never.
00:05:06
You should explain this whole experience, what happened.
00:05:08
I was trying to make like sort of a news aggregation site,
00:05:12
basically like get the top headlines in tech news and so I
00:05:17
could build like a comment function and you know, it was
00:05:20
pretty cool, but then I got hung up.
00:05:22
You got stuck. I moved this button and, like,
00:03:25
no matter how many times and ways I'd ask it, and you know, you'd
00:05:28
be taking your conversation and dumping it into Anthropic and
00:05:31
ChatGPT and still it just like for whatever reason, it just
00:05:35
like it got stuck. So I had sort of something
00:05:38
visually cool, but then it didn't turn into anything.
00:05:42
So. You sort of had this last mile
00:05:44
problem. I think this is another, I don't
00:05:46
know, that's a good conference theme. We didn't put it in.
00:05:47
Yeah, yeah. All right.
00:05:49
That's not one of our official themes or it could because
00:05:51
that's good because we have the self driving car companies and
00:05:54
they are the sort of test case of the sort of you have to
00:05:59
get the last 1%, you get the last 5% and a lot of these
00:06:02
products look really good. And then they yeah, I like it
00:06:05
last mile problem. That's a good.
00:06:07
The last, the last mile problem is the overall theme of the
00:06:09
conference, I think. Yeah, I think Lovable really
00:06:12
gives you that last mile problem experience where you feel like
00:06:15
you could build 98% of the website, but it doesn't work
00:06:17
without the last 2%. And you're like, God, I was so
00:06:21
close. And the question is, with
00:06:23
the next generation of models or the next generation of tools or
00:06:27
just Lovable itself becomes better, you know, do we cross
00:06:30
that last mile? You know, I don't know, James,
00:06:33
you've experienced this from the engineering side of things.
00:06:35
Someone was tweeting about this idea that coding models would
00:06:40
allow finally everyone to complete those like 90 side
00:06:44
projects that they have sitting around on their computers that
00:06:46
they have just been, you know, tinkering on for decades but
00:06:51
never get finished. And Sam Altman responded and
00:06:53
said, like, yes, so excited to see all those projects get
00:06:57
finished. But I actually think the
00:06:58
opposite is happening to me, at least, where it's like 10Xing
00:07:01
the number of side projects that I have that are unfinished.
00:07:04
You're like, you're like now I have 900 unfinished.
00:07:06
Now I have 900 unfinished. Yeah, so I definitely think this is an
00:07:11
issue, but I don't know, every few months or weeks even, it
00:07:14
seems like the models get more capable and it allows me to make
00:07:19
more progress. So it's less that I'm worried
00:07:21
that this is like a number. You'll have 100 projects come
00:07:24
out. They're all, yeah, they're all
00:07:27
getting close. You're building them in
00:07:28
parallel. And then one day I'll just smile.
00:07:32
They'll all just be completed. I mean, but that is
00:07:35
sort of how, you know, AI people can sort of talk about, you
00:07:38
know, the singularity or AGI. It's it's like, oh, we're going
00:07:41
to have drug discovery, we're going to have this boom, you
00:07:44
know which, you know, I don't think we're necessarily closed
00:07:46
off to, but yeah, there is this sort of like mode of all at
00:07:50
once. Yeah, true.
00:07:52
The, the takeoff. Fast takeoff or...
00:07:54
Fast takeoff. Exactly.
00:07:55
Yeah, yeah, I have a theme that's sort of related to your
00:07:58
design one. Yours is really from like
00:08:01
the employee perspective. This is from maybe corporate
00:08:04
strategy, but the theme is text box versus product.
00:08:10
Like I think ChatGPT has introduced this idea that like a
00:08:13
company, you know, you can just have the company in a text box.
00:08:16
And in some ways, like Lovable is that in some ways Figma,
00:08:20
which is like a very intentionally designed product,
00:08:23
now feels the threat of like, oh, we need our almost text box
00:08:27
solution. And so I think there's a real
00:08:30
question of the moment is, do people want to interact with
00:08:33
their tech products by typing to them?
00:08:36
Or is that just sort of the kludgy interim solution before
00:08:40
people build great product design around the capabilities
00:08:43
of AI? I mean, I think this actually
00:08:46
comes back a little bit to some of the voice discussion we had
00:08:48
last time, but our sort of general thesis at our company
00:08:53
Volley is there's really only like two ways people actually
00:08:57
interface with the world, interact with the world, right?
00:08:59
And it's basically like with your hands, you know, obviously
00:09:04
touching the screen, typing on a keyboard, whatever, playing with
00:09:07
the game controller or by talking essentially or
00:09:11
outputting words, right. And obviously, you know, you're
00:09:14
using typing as the sort of the current way we put words into a
00:09:17
computer. But like we believe the future
00:09:19
will be talking into a computer rather than typing into a
00:09:21
computer. But I do think basically if you
00:09:24
say, hey, humans only have two ways to do anything in the
00:09:27
world, right? With their hands or with their
00:09:29
mouths? Except for some reason driving a
00:09:32
car. Which that's what.
00:09:34
Hands. Probably your feet, your
00:09:35
steering wheel. OK, we use your feet
00:09:37
occasionally for, you know, other things.
00:09:39
But I've always found that kinda bizarre that we trust our
00:09:42
feet with only one task, the most important task.
00:09:47
Anyway, most dangerous. What I was saying, besides the
00:09:50
pedal driven Photoshop that James is imagining here was that
00:09:54
to date, you know, most creative software has either been, you
00:09:58
know, point and click, drag and drop, move a mouse or move your
00:10:02
fingers touching or it's been, you know, typing in a box.
00:10:04
And I just think that the headroom for what you can do
00:10:07
typing in a box is probably 10 or 100X what we
00:10:12
have done to date because it is one of the major ways humans
00:10:16
interface with the world. It's how we're interfacing right
00:10:18
now in this podcast, right? Anytime you do a phone call,
00:10:21
like what we're not doing on this podcast, we're using
00:10:23
Riverside right now, which is like Zoom effectively.
00:10:26
It's like, yeah, we didn't fire this up.
00:10:29
And being like, what tools do we want?
00:10:30
We need a producer to be able to be in the background.
00:10:32
We need, you know, to be able to toggle what camera we want.
00:10:35
You know, it's just like using a text box is like deriving from
00:10:38
first principle, like everything you want for, you know, I was
00:10:41
creating cute images for my wife last night on ChatGPT.
00:10:44
And it's like you have to ask it like, Oh yeah, what are the
00:10:47
different, like color, like, you know, graphic types.
00:10:51
But you probably do. But it's like you have to
00:10:52
reinvent it every time. I just think the text box is not
00:10:55
ideal for most of the use cases in great product design.
00:10:59
I don't think it's ideal for many use cases that we have
00:11:02
developed to date. And I think those aren't going
00:11:04
away anywhere. I don't think that like the
00:11:06
concept of a Photoshop Figma type experience or, you know,
00:11:09
things like that are going to disappear.
00:11:12
But to me, it seems clear that like anytime you want a piece of
00:11:17
information or you want, you know, a fairly simple action
00:11:21
carried out on your behalf, it's much easier to just put
00:11:24
that request out in the world as text, you know, oh, explain to
00:11:28
me this thing. Go research this thing.
00:11:31
Go do this simple thing, you know, even something as simple
00:11:34
as like, you know, DoorDash ordering agent, which we don't
00:11:36
have today. But if you were like, Oh yeah,
00:11:38
go order the pepperoni pizza. I always order from this place,
00:11:40
like, you know, and have it at my house in under an hour.
00:11:43
Like that is still easier than the 95 clicks it takes to do it
00:11:46
in DoorDash. If that actually worked, right.
00:11:48
So I think there's a sort of U curve of complexity where if you
00:11:51
get really, really high complexity tasks, then of course
00:11:54
you're going to want to use your hands and drag and drop and
00:11:56
touch things and so on. But there's lots of low
00:11:58
complexity tasks, or even medium complexity tasks that you could
00:12:01
do fairly easily with text or with your voice.
00:12:03
And it's almost like you can do it faster that way because,
00:12:07
yeah, it's much faster. Yeah.
00:12:09
Like you don't have to like learn the interface.
00:12:10
It would be much faster if Siri actually worked.
00:12:13
And I was driving home from work and I was like, hey, like order
00:12:16
me the, you know, I get like for, for dinner or whatever, you
00:12:19
know, and they'll be like, Oh yeah, the from this place, it's
00:12:21
like, you know, it's 15 bucks and I'll be like, yeah, done
00:12:23
right. If that worked, that would be
00:12:24
better than using DoorDash. I think, and I mean taking the
00:12:27
Riverside podcast recording application as an example here,
00:12:30
there aren't that many buttons. Like we kind of just send out a
00:12:34
link and join. But no, there are way more
00:12:37
than that. Like, I feel like I do
00:12:39
use them all though. I think like if there was a little voice
00:12:42
agent or text box here where you could be like, you know,
00:12:45
accessing the deeper features of Riverside, then you might
00:12:48
actually use them. The beauty is suggesting
00:12:50
functionality that I wouldn't even necessarily have imagined.
00:12:53
You know, right? Right.
00:12:54
Yeah. I mean, presenting those
00:12:55
affordances on screen visually is still valuable.
00:12:58
We're not saying get rid of that.
00:13:00
We're just. Saying doing stuff is easier.
00:13:02
Isn't it a pretty big, big coincidence that like, we've
00:13:06
decided that text is a great way to like create products, just as
00:13:11
like that's the most straightforward way to build
00:13:13
them. I, I, I don't know. It's.
00:13:15
Not a coincidence. It's literally how humans
00:13:17
communicate, right? I mean, you use Slack, I know
00:13:19
you use Slack because we're on a slack together, right?
00:13:21
You send text messages, I'm saying. You text message
00:13:23
each other. You're a writer. Your product...
00:13:25
Your product is words in a box. Literally, your product
00:13:29
is an e-mail newsletter, right? I know, I know, I know.
00:13:32
How many times a day
00:13:33
do you use text in a box in some way or another?
00:13:36
Right? I've been seeing a resurgence of
00:13:38
that graphic of like Craigslist where?
00:13:43
Oh yeah, yeah, the unbundling of Craigslist.
00:13:45
Bundling of Craigslist. My God, yeah.
00:13:48
Like, and then, you know, is that gonna happen with
00:13:51
ChatGPT? I think that sort of gets to
00:13:53
some of the questions that we're asking here.
00:13:56
I'm curious if do you guys have any ideas of like what do you
00:13:59
think would be unbundled from ChatGPT?
00:14:02
Is it gonna be like this therapist use case or like is it
00:14:06
just the most popular use cases or is it the most valuable use
00:14:11
case? Like what startups would you
00:14:12
actually build to unbundle ChatGPT?
00:14:15
That's a good question. I mean, obviously you
00:14:19
would want something that's reasonably popular.
00:14:21
I do think just off the top of my head, I think the therapy use
00:14:24
case is probably one that will be unbundled because I think
00:14:26
that there's probably going to be some sort of like AMA
00:14:29
compliant version of a GPT therapist at some point.
00:14:32
You know, some sort of like actual doctor endorsed or doctor
00:14:36
founded or you know, psychologist approved version of
00:14:40
the sort of GPT therapist. And I think that will
00:14:42
probably unbundle some of that use case.
00:14:44
I do think like you and I are obviously sort of in the early
00:14:48
days on this, but we both have kids and I think that childhood
00:14:50
education use cases from GPT will definitely get unbundled in
00:14:53
some way. Where like a lot of parents are
00:14:55
talking to GPT now being like explain gamma rays and X-rays to
00:14:59
my kid because they just watched Incredible Hulk or whatever,
00:15:02
right? And there's like definitely some
00:15:03
sort of very focused educational, I think use case.
00:15:07
I just think everything that's
00:15:09
useful, either people will train it better or they will build a
00:15:15
product around it that's better.
00:15:17
And then it suggests, sure, what features are hidden inside
00:15:20
of ChatGPT. Instead of having this thing you need to pull out,
00:15:24
it will market to you. Oh, here are the different
00:15:26
things that you can do in this category.
00:15:28
And yeah, I mean, there's value in the marketing
00:15:31
alone, right? Because the marketing alone,
00:15:33
just to have somebody yelling at you, you can do this cool thing
00:15:36
here. You know, like I wouldn't have
00:15:37
thought about it unless you're telling me. The problem is
00:15:40
ChatGPT, you know, it's like, do anything, that's too
00:15:43
broad. I do think we're talking about
00:15:44
separate things. I guess I would say like, I
00:15:46
think a lot of these unbundlings like James and I are doing an
00:15:50
unbundling of GPT where we're building, you know, a Dungeons and
00:15:52
Dragons GPT wrapper in some sense, right?
00:15:55
You still want the text input and the voice input to be like a
00:15:59
big way that people play the game, right?
00:16:01
You might want them to be able to click on things.
00:16:03
You want to be able to show characters on screen.
00:16:05
You want to be able to make cool.
00:16:07
Or the monsters. Environments.
00:16:08
D&D is an argument for the text box rather than the product, as
00:16:13
I'm sure. I mean, I'm just saying that I think
00:16:15
more, I think that's sort of the 1st order reaction and that then
00:16:18
it's like, oh, we need to get move.
00:16:21
But even with the educational use case, I still think you,
00:16:24
even if you have a bunch of stuff on screen, like my
00:16:26
daughter's learning to read right now, you want to show the
00:16:29
different sounds and how she's sounding them out.
00:16:31
And she has to be able to tap, you know, the UR versus the BA
00:16:34
or whatever. But like, you also want in a, in
00:16:37
a sane world, her to be able to say, you know, rat versus cat
00:16:40
and have it be like, yeah, you said rat.
00:16:42
That was correct, you know. So you know, and she can't type
00:16:46
yet. But there is obviously both text
00:16:48
input and touch input, which is kind of the point I was trying
00:16:51
to make. Yeah. And I like your overall
00:16:52
frame. The point is that ChatGPT has
00:16:54
unlocked what was a very natural way of communicating that we
00:16:58
couldn't use with machines before. So, right.
00:17:00
Yeah, it just unlocks a lot of new opportunities, more
00:17:03
exploration in a lot of ways, because you can just ask the
00:17:05
thing if it can do the thing that comes in your head versus
00:17:07
before you had to like dig in through menus, as James said.
00:17:10
Can I share, can you guys see? Yes. A big use case map
00:17:14
here. I actually haven't seen this.
00:17:16
This is super interesting, right?
00:17:18
This is. Is this their agent guidebook?
00:17:20
This is how they built their multi-
00:23:21
agent research system. I read this today actually, but.
00:17:24
I was excited, then at the bottom they said: embedding plot showing
00:17:29
the most common ways people are using the research feature.
00:17:31
So they're not necessarily just generic Claude, but there was some
00:17:35
really interesting stuff here like translating documents
00:17:38
between languages, developing advanced automated trading
00:17:42
systems and strategies, researching pharmacological and
00:17:46
performance-enhancing substance innovations.
00:17:49
So I think to me, all of these are like good candidates, right,
00:17:53
For being kind of unbundled. Unbundled.
00:17:56
To spin it a different way, these are all valuable, wonky
00:18:00
things that companies have a lot of incentive to do the hard way.
00:18:03
Yes. Yes, and in some ways what
00:18:05
people aren't doing are the cool but chill consumer things that
00:18:09
aren't worth figuring out how to do.
00:18:12
And there you need a company to really tell people, oh, you
00:18:15
should be doing this here. It's good for this thing you
00:18:17
haven't thought of. And like if it analyzed sports
00:18:20
betting odds gets its own little thing here like anything that
00:18:24
involves like potentially making money, obviously monetizing so.
00:18:28
Now I've discovered something fundamental about human beings
00:18:31
here. Yeah, I mean, I think that like
00:18:33
the sports betting odds is a good example because the people
00:18:36
who need that or want it, like really badly want it.
00:18:39
But it's objectively just not that large of an industry,
00:18:42
right? I mean, like sports betting is,
00:18:44
you know, there's lots of money in sports betting, don't get me
00:18:46
wrong, but it's not like the size of like, you know,
00:18:50
enterprise software or whatever silly stuff we talk about on the
00:18:52
show. So I just think that it's an
00:18:54
interesting example of like the market size of sports betting
00:18:58
odds analysis is probably not that big.
00:19:01
So maybe there's like, you know, one company that could
00:19:03
really execute well on that, but it's also possible that just
00:19:05
like individual bettors will fine-tune Claude and, you
00:19:10
know, figure out how to develop cutting edge strategies.
00:19:12
I think Claude could be strong in all these.
00:19:15
All right, next theme. Max, you're up.
00:19:17
Yeah, next theme, a lot less sexy, but I do think something
00:19:21
that I keep thinking about with Granola, which is one of our
00:19:24
speakers at the conference, which is a meeting notes
00:19:27
recording and summarization analysis tool.
00:19:30
Is this idea of: is there a sort of Microsoft Office suite
00:19:36
for the AI era or not, right? And look, I warned you
00:19:40
this wasn't a. Sexy topic.
00:19:42
Yeah, like, oh man. I'll try to.
00:19:46
I'm a consumer founder, so I'll try to make this as fun and
00:19:48
interesting as possible. But we all grew up with Word,
00:19:51
Excel and PowerPoint and... They're some of the most
00:19:53
valuable things in the world, obviously.
00:19:55
Exactly. Outlook, right?
00:19:56
Yeah, I mean, they're literally worth like a trillion dollars if
00:19:59
you added them together, right? Which is insane: 40
00:20:03
years of lock-in for Microsoft, you know, and one of the most
00:20:06
durable businesses of all time. OK, so finally, for the first
00:20:10
time in our lives, I feel like people are starting to do
00:20:12
interesting things where they're taking a different layer of
00:20:16
work, right? And they're saying, hey, instead
00:20:19
of doing documents as sort of the fundamental layer of
00:20:21
work, I'm going to do like conversations and interactions
00:20:26
and sort of insights from, you know, from those meetings and
00:20:29
those discussions, right? And I guess the question is,
00:20:31
yeah, like is there a different slice of work where AI can
00:20:36
capture, you know, every meeting you have, every e-mail you sent,
00:20:40
every time you talk to someone, every, you know, and and then
00:20:43
also suck in all these documents and kind of create like a higher
00:20:46
level abstraction of how we do work.
00:20:48
And I think that is kind of what Granola is going for.
00:20:50
It's also what Notion's going
00:20:52
It's also what gleans going for. It's also what open AI is going
00:20:56
for, right. That's what I was
00:20:57
going to say, that's what ChatGPT should be. I mean, if
00:21:00
Word is everything, right? I mean, I'm in ChatGPT, I'm
00:21:05
doing the thinking. And if writing is sort of like
00:21:07
an output or a part of the thinking process, capturing some
00:21:11
of that in a document makes so much sense.
00:21:13
I think I'm pretty bullish on
00:21:16
OpenAI. Yeah. Well, the interesting thing is James and I
00:21:19
went through a discovery process at our company where we were
00:21:21
like, hey, we need one of these tools that, like, sucks in
00:21:24
everything, every document that Volley has ever created or
00:21:27
interacted with. So Dropbox and all these Word
00:21:29
files and, you know, Notion notes we've written to each
00:21:33
other and and so on and so forth.
00:21:35
And we realized like, no one really has all the pieces of the
00:21:38
puzzle right now. Like OpenAI just announced that
00:21:41
they're going to have almost everything, but they don't have
00:21:42
Notion. And we use Notion. And then like
00:21:45
Glean has like almost everything, but they're missing
00:21:46
something else. And then, you know, like granola
00:21:48
is very early. They really just have meeting
00:21:50
notes. So like no ones put together the
00:21:53
sort of coherent like, hey, you only need to sign up for this,
00:21:56
which is sort of what Microsoft Office did in the good old days.
00:22:00
And so I think OpenAI wants to do that and I think Notion
00:22:02
and Glean both want to do that too.
00:22:04
And it's unclear to me who's going to get there first
00:22:06
because I think OpenAI's got a lot of other stuff on their
00:22:08
plate. I was thinking this was going to
00:22:10
be one of my themes, but since we're talking about it, I do
00:22:13
think there's actually kind of that war kind of brewing of who
00:22:16
owns the context, right? Like there's a reason Notion
00:22:19
doesn't want to be exported into ChatGPT, right?
00:22:22
There's also like this kind of counter force or counter trend
00:22:26
where everyone's trying to come up with like the protocol or, or
00:22:30
kind of, yeah, open source version of how you, how you
00:22:33
provide context into models like the model context protocol,
00:22:37
right. What are we calling this theme?
00:22:39
Well, the theme is the war for context or something.
00:22:41
Or open versus closed, for the 8th
00:22:44
time. Yeah, true. I think Google are
00:22:48
the... I don't know if they've fixed
00:22:49
this recently, but when I tried to integrate Google Docs and
00:22:52
Drive and Gmail into Open AI a few weeks ago, it was like a 60
00:22:57
step process that also involved setting up a Google Cloud
00:23:01
project in order to expose the information. And
00:23:01
OpenAI has a document of like how to do this, and it's
00:23:06
literally like 60 steps. I want that so bad.
00:23:13
Google doesn't want this in ChatGPT.
00:23:15
Notion doesn't want it in ChatGPT.
00:23:17
At the same time, some companies like Linear are, you know,
00:23:21
intentionally leaning into MCP because they do kind of want you
00:23:24
to be able to edit and interact with your tasks through MCP.
00:23:29
Model context protocol. It's like
00:23:31
the new way, a neutral way that everyone talks to each other.
00:23:35
Yeah, sorry, I'm sure that wasn't for me.
00:23:38
That was for the poor listener. Who doesn't.
00:23:40
Sure, sure, sure. No, MCP will probably be something that comes
00:23:45
up a lot at the conference too. That's been a hot topic
00:23:48
for sure. Do you want to explain it for
00:23:50
the audience, James? Sure. It was created by
00:23:53
Anthropic and it's designed to expose context to agents where
00:23:59
they are operating, right? So if you think of like Cursor
00:24:04
has MCP connectors, I could go build an MCP server myself
00:24:09
and make a bunch of context accessible to that MCP server.
00:24:12
I could expose it in Cursor. So now for example, all of our
00:24:16
Volley developers could have access to like a style guide or
00:24:20
Volley-recommended documentation or things like
00:24:23
that, right? So it's basically a way to share
00:24:25
data into these LLMs, and it allows you, in an
00:24:30
open-source way, to have an agent in one place and have all your data
00:24:34
somewhere else. Yeah.
00:24:35
And it also starts to allow you to expose tools to those agents
00:24:40
too, right. So your MCP server could expose
00:24:43
tools from Linear, for example, that would allow the agent to go
00:24:46
update the ticket or close it out or kind of interact.
00:24:50
So yeah, I think it's just one agreed-upon or neutral protocol,
00:24:56
right, for exposing this stuff to agents.
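To make James's description concrete, here's a minimal conceptual sketch of the two capabilities he names: resources (read-only context, like a style guide) and tools (actions, like closing a ticket). This is not the official MCP SDK; the method names echo MCP's JSON-RPC-style methods (`resources/list`, `resources/read`, `tools/list`, `tools/call`), but the dispatcher, the `volley://` URI, and the `close_ticket` handler are illustrative assumptions.

```python
# Conceptual sketch of an MCP-style server: one dispatcher exposing
# "resources" (context an agent can read) and "tools" (actions an
# agent can invoke). Names and data are illustrative, not the SDK.

RESOURCES = {
    # e.g. the style guide James imagines exposing to Volley developers
    "volley://style-guide": "Use 2-space indents; prefer TypeScript.",
}

def close_ticket(ticket_id: str) -> str:
    # Stand-in for a real integration, e.g. updating a Linear ticket.
    return f"ticket {ticket_id} closed"

TOOLS = {"close_ticket": close_ticket}

def handle(request: dict) -> dict:
    """Dispatch one agent request to the matching capability."""
    method = request["method"]
    if method == "resources/list":
        return {"resources": list(RESOURCES)}
    if method == "resources/read":
        return {"contents": RESOURCES[request["params"]["uri"]]}
    if method == "tools/list":
        return {"tools": list(TOOLS)}
    if method == "tools/call":
        params = request["params"]
        return {"result": TOOLS[params["name"]](**params["arguments"])}
    return {"error": f"unknown method: {method}"}

# An agent (e.g. inside Cursor) could fetch context...
print(handle({"method": "resources/read",
              "params": {"uri": "volley://style-guide"}}))
# ...or invoke a tool to act on another system.
print(handle({"method": "tools/call",
              "params": {"name": "close_ticket",
                         "arguments": {"ticket_id": "VOL-123"}}}))
```

The point of the neutral protocol is that the agent side only ever speaks these generic methods; the server decides what data and actions to expose.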
00:24:58
I mean, it does feel like everyone's kind of gotten wise
00:25:00
from the last few generations of these corporate battles over who
00:25:04
owns what. Who owns what?
00:25:05
We don't, we don't need antitrust.
00:25:07
We just need, you know, good competitive incentives.
00:25:10
Well, I feel like everyone's like, no, no, no, no, no, I'm
00:25:13
not going to give you my data. Why don't you give me your data?
00:25:16
And it's, and it's like, no, no, no, trust me, trust me, trust
00:25:20
me, trust me, trust me. Just give me your data and we'll
00:25:23
send you more traffic and it'll all be good.
00:25:26
And that's basically what happened.
00:25:27
You know what Google did and what Facebook did and whatever.
00:25:29
And so like, I think everyone's getting pretty smart now where
00:25:31
like Notion isn't sharing their data with OpenAI.
00:25:34
And to your point, it takes 60 steps to get Google Docs into
00:25:37
OpenAI now. And so like, everybody's like,
00:25:40
hmm, no, how about you give me what you know and I'll be the
00:25:43
one box to rule them all for everyone.
00:25:45
Right, all right, let's move on to the next theme.
00:25:48
I'm only one for three down. How many themes have we covered
00:25:52
here? One of mine was kind of vibe
00:25:53
coding, which we covered with the design thing, so I sort of
00:25:55
just have one more. OK, you have one more, one of
00:25:58
two more left. Yeah, you go for it.
00:26:00
This one I didn't need to give a clever name.
00:26:01
The term itself is ominous, but surveillance state, I think, you
00:26:07
know, we've got Granola, which obviously
00:26:09
requires a level of recording, and everybody's getting recorded.
00:26:13
So we have this question of, are we all getting listened
00:26:15
to all the time? And then the other big one
00:26:18
that's relevant right now is self driving cars.
00:26:20
You know, I mean, I'm very hostile to Waymos getting
00:26:24
attacked. But you did see, I think it was
00:26:27
Taylor Lorenz did an interesting YouTube video where she was
00:26:31
talking about it. I like self driving cars, but
00:26:34
they are recording protesters everywhere.
00:26:36
You know, it is just like another piece of the sort of
00:26:39
we're constantly being monitored.
00:26:41
I have never been a big privacy person, so
00:26:45
it's not like I'm like up in arms about it, but I've always
00:26:48
not really cared about privacy. I'm, as you guys know, an over-
00:26:51
sharer, but it's surprising to me that privacy is not a larger
00:26:55
part of the criticism of AI, because so much of this stuff requires
00:27:00
like ubiquitous surveillance. There's also that New York Times
00:27:05
court order that requires OpenAI to store and save every chat
00:27:10
you've ever interacted with on the platform.
00:27:13
Which is pretty wild I think. I think it's a massive overreach
00:27:16
from the judge but whatever. Requiring them to store every
00:27:20
single thing everyone types into ChatGPT for the purposes of the
00:27:22
lawsuit. I don't know, I think that's
00:27:26
shit. I was just using ChatGPT last
00:27:26
night to create Pokémon and I'm like, how is this legal?
00:27:29
Like not even new Pokémon, just like hey, make a make an image
00:27:34
of a Pokémon and it's like sure, happy to I know all the Pokémon.
00:27:37
You know, it's just like, isn't that the most blatant copyright?
00:27:41
I, I don't, I know you can draw, but like somebody should have
00:27:47
to, they should have to sort of keep all this stuff they're
00:27:49
creating, because it's like, oh man, there are going to be
00:27:52
lawsuits for like ages. Well, that's fair.
00:27:56
Yeah. I think I'm more worried about,
00:27:59
like, the law enforcement angle to it.
00:28:02
Like will everyone who gets arrested have to turn over their
00:28:05
ChatGPT history? I'm scared about
00:28:07
that. Yeah.
00:28:08
Was it was it Sam Altman? Somebody was suggesting a new
00:28:11
set of legal principles. Yeah.
00:28:12
Yeah. I think Sam Altman was saying it
00:28:14
needs to be like your therapist or your attorney, which I
00:28:17
totally agree with. I don't think it should be
00:28:18
unique to AI. Hey, I think it's insane that they can force you
00:28:22
to use your face to open an iPhone.
00:28:24
Like I think all that should be sacrosanct.
00:28:27
I think Google history, right, is clearly subpoenaable or
00:28:31
something, right? Yeah, that's scary.
00:28:33
Yeah, I know. So you should be able
00:28:35
to try to figure out how to get away with murder, you know.
00:28:39
The only... Yes. If we can't Google, how are we
00:28:42
supposed to murder anybody? How to do insider?
00:28:44
How to do insider trading comes up a lot.
00:28:47
Yeah, it's like, what, I can't do anything? Right, if our
00:28:55
brains are dependent on Google and OpenAI, like, they're
00:28:57
just literally gonna stop murders.
00:29:00
It's not... Maybe we can now say, oh,
00:29:01
that was my agent. I was...
00:29:03
I wasn't telling it to. Yeah. Yeah, plausible deniability.
00:29:06
Yeah, that would actually be an interesting privacy strategy.
00:29:10
It's like, oh I create a bunch of weird shit all the time to
00:29:13
distract from my digital record of like, what is?
00:29:16
Actually my browser usage tool that ChatGPT accessed and then said
00:29:23
go wild. So then.
00:29:25
Please obscure the weird shit I do with other weird shit so
00:29:28
nobody can prove which ones are me and which ones are the agent.
00:29:30
That's that's interesting. Yeah.
00:29:32
I mean, as to your surveillance state point, I think there's
00:29:34
like 2 separate points, right? One is like having cameras all
00:29:37
around us all the time out in public, right, Which I feel like
00:29:40
most Western countries have already done anyway.
00:29:42
Like, I mean, sad to say, but almost in any like European
00:29:45
country or Asian country or much of the United States, if you're
00:29:48
in public, you're probably on camera already.
00:29:50
Like so I guess that doesn't stress me out that much that
00:29:53
there's even more cameras owned by Google.
00:29:55
Like I mean literally every single person has a phone with
00:29:58
three cameras on it in their pocket they can pull out at.
00:30:00
any time. Good. I mean, we just had the
00:30:02
manhunt over this terrible Minnesota thing, right?
00:30:05
And it's like that guy's in the woods.
00:30:07
But it's still like, you've got to imagine that.
00:30:08
Yeah. Cameras everywhere going out.
00:30:10
So, like, I'm not saying there isn't an opportunity for, like,
00:30:13
civil rights violations, obviously there is.
00:30:15
But I don't think it's been this horrible thing, that we're all
00:30:17
being filmed all the time when we weren't, you know, 20 or 30
00:30:19
years ago. I do think the other one, the
00:30:21
interesting question is like, yeah, the work, you know,
00:30:23
listening to all your meetings, monitoring stuff that Granola
00:30:27
and a bunch of other people are doing.
00:30:28
Now there's a separate question as to, like, the modality
00:30:31
of that. This conversation that we're having is being recorded.
00:30:33
Probably. Oh yeah, exactly right.
00:30:35
Yeah, yeah. Take off
00:30:36
your tinfoil hat. Yeah, exactly.
00:30:38
I think there are people listening to us as we.
00:30:41
speak. Yes, so I have, I have like sort of a hot take
00:30:45
that I'm not sure I 100% believe, but I will say which is
00:30:47
that, you know, athletes right from the moment they're like 5
00:30:51
years old at this point, have every single moment on the court
00:30:53
or on the field like filmed, right? And then they go home and
00:30:57
you know when they're young, it's with their parents or with
00:30:59
their, you know coaches and they review the tape and then as they
00:31:01
get older, you know if they get into the NFL, they literally
00:31:04
spend like I think 90% of the workload of an NFL player is
00:31:07
like reviewing tape, right watching tape right and so I
00:31:11
sort of think of like maybe we're entering this world where
00:31:14
we all have a lot more like game tape, you know, of of how we're
00:31:18
acting in work settings and in particular, I think.
00:31:21
Somebody at YC sees, like, that moment: that's where you weren't
00:31:24
grinding hard enough. You should have been grinding
00:31:26
right there and you weren't, right?
00:31:28
Well, it's. Like, again, I just think the
00:31:30
athlete analogy is interesting, right?
00:31:32
Because no one's like, oh, we shouldn't like track how many
00:31:34
like points Kevin Durant scored, right?
00:31:36
Because like, it would be embarrassing if he had a bad
00:31:39
night, you know? And it's like, oh, we shouldn't
00:31:41
like have a stopwatch like on the track because like,
00:31:43
whatever, some of the kids are slow.
00:31:45
There's other kids, you know, like,
00:31:47
it's not... Right, you're making a point.
00:31:49
Not everybody has their jobs scrutinized as much. It's
00:31:52
a specific window. There was outrage from the
00:31:55
previous YC batch, right? There was that outrage over that
00:31:58
manufacturing startup that was monitoring worker behavior by video.
00:32:02
They leaned into it. They wanted, I think, some of the
00:32:04
backlash, to get attention. I mean, they film every
00:32:08
practice, so whatever. Let's not get too hung up on the
00:32:10
analogy. They film every practice.
00:32:11
Baseball players play 160 games a year.
00:32:13
Like whatever number you want to come up with in terms of how
00:32:15
much game tape is being produced, I think it's fairly
00:32:17
comparable to going to work. So yeah, I think it's an
00:32:20
interesting question. Like, is there a way to use it
00:32:22
in a way which like makes you more effective or more creative
00:32:26
or, you know, only do the parts of your job you're good at that
00:32:29
you like, and maybe, you know, other people get tasked with the
00:32:32
parts you're not good at? Or, you know, like, can we see more self
00:32:35
improvement in life if we're like actually tracking how we're
00:32:37
doing? And I agree, there's a ton of
00:32:39
like Big Brother crap where it's like, I don't want my boss like
00:32:42
watching every meeting and like judging every single remark I've
00:32:45
made. But like having quote UN quote
00:32:47
game tape for your job seems like it's going to be a part of
00:32:50
all of our lives whether we want it to or not.
00:32:53
I think people will want this in a way that acts as sort of like
00:32:57
a coach. So I think if you I'm not sure
00:32:59
about the actual mechanics of the the storage of the data and
00:33:04
whether your boss has access to it, but I think a lot of people
00:33:07
would want sort of this new, you know, like we all have ChatGPT
00:33:11
this sort of neutral party to, you know, receive advice from,
00:33:15
right? What if you can do that at work,
00:33:17
right, without it kind of having to be a conversation with
00:33:20
your manager, right. So someone, someone's kind of
00:33:23
monitoring your work, an agent in this case, and it's able to
00:33:26
give you very clear, actionable feedback that doesn't come off
00:33:30
as impersonal because it's coming from your agent.
00:33:34
So I don't know, I think people might actually like that.
00:33:36
I mean, therapy is, you know, one of the most popular uses of
00:33:40
these chat bots. Yeah.
00:33:41
It used to be like, oh, somebody could gaslight you and you're
00:33:44
like, oh, maybe. But now you ask the chatbot and
00:33:46
it's like, oh, there's a neutral party and it's like, no, I'm
00:33:49
getting gaslit. I'm the one who's getting
00:33:50
screwed over. Like the terrifying version of
00:33:53
this, I think to fit into surveillance state again, is
00:33:56
just like, just imagine, you know, things that matter for me,
00:34:00
right? Like my employees are reacting
00:34:02
to me, or like I'm seeing someone, it's all recorded.
00:34:06
And then I get the play by play afterwards from the AI.
00:34:08
And it's like, you know, that moment where their eyes darted
00:34:11
to the side? They thought you were bullshitting there.
00:34:13
You know, it's like if you have a really sort of a high EQ AI,
00:34:18
which seems very plausible, honestly, some of the readouts
00:34:21
on how you're being perceived could be pretty brutal.
00:34:25
You know, it's like there's a lot of social interaction that
00:34:28
depends on getting away with people being like, oh, yeah, if
00:34:31
you really scrutinized their face, you would have been
00:34:33
able to tell that that person thought you were an idiot right
00:34:36
there. But like best for everybody,
00:34:37
that was not communicated. And I do think the level of self
00:34:40
consciousness that this will induce if we all have glasses
00:34:43
that are reading everyone's faces and giving us live
00:34:45
feedback all the time. Like, right, that subtle micro-
00:34:48
expression that you're only supposed to sort of understand,
00:34:50
now you're getting a readout of what it was meant to say.
00:34:53
That's like, I don't know, sounds bad and.
00:34:54
I think that Granola, the story with Granola, makes us realize
00:34:59
whether or not we have glasses, these things will be just
00:35:01
running on people's computers, right?
00:35:03
So in any Zoom meeting, someone could be recording it without
00:35:07
you really knowing. Maybe they're not supposed to do
00:35:09
that, but that's clearly happening.
00:35:10
And then if any of these video learning models get to the point
00:35:14
where you can, like you said, you know, kind of just detect
00:35:16
facial expressions that'll just be running without you knowing
00:35:19
as well. So I don't even think it's a
00:35:20
risk just for glasses like it's a risk today with.
00:35:24
With Zoom, because basically your computer is a giant pair of glasses.
00:35:28
Yeah, I mean, whatever.
00:35:29
It's a giant pair of eyeglasses over your meeting, so.
00:35:31
That's Cluely, right? Cluely is pitching that you cheat
00:35:34
on anything through your computer, because you... Cluely, clearly, right.
00:35:38
Cluely. I mean, the defense of Granola
00:35:40
is that in some ways they're not keeping the
00:35:42
recording. They're trying to like sort of
00:35:45
like, OK, let's kill some of the parts that people don't like.
00:35:48
You know, there are ways that we could, you know, tweak them so
00:35:51
that it's like just the level of social comfort and not too much.
00:35:55
But they're pumping the recording to either Sam
00:35:58
Altman or DeepMind or whatever. I mean, it's going to somebody,
00:36:01
like... I mean, the recording's, like... But, like, you
00:36:03
think the recording is kept? I don't think it is.
00:36:04
That's like saying Zoom is kept. No, but what do
00:36:07
you know? Whether or not it's kept,
00:36:09
it's being used to train the next generation of models.
00:36:12
Like I just don't, I don't believe it's not going somewhere
00:36:14
where somebody's getting value out of it.
00:36:16
I'm not saying there's a pile of all my Granola recordings
00:36:18
sitting somewhere, but like first of all, they let you chat
00:36:20
with the transcript so they store the entire transcript.
00:36:22
Right, right, right. Definitely.
00:36:23
There's a transcript there, you can see it yourself.
00:36:25
You're just saying the audio. I'm saying the audio.
00:36:28
I'm just saying there are ways that even what we're capable of
00:36:31
doing, you can build a product that's more appealing to people
00:36:33
that says, OK, we'll do this and not this.
00:36:36
It's like superhuman level. Yeah.
00:36:39
Yeah. I, I mean, I think, I think it's
00:36:41
going to be an arms race and we're all, you know, going to
00:36:43
keep escalating this thing. Yeah, I agree.
00:36:45
Basically, if you're probably recording me, then I should
00:36:48
probably be recording in case you sue me.
00:36:50
So I totally have a transcript too, you know.
00:36:53
Right. All right, next theme.
00:36:55
I have a last theme and you guys can decide if you think it's
00:36:57
interesting enough or not. We can just cut it, but I think
00:36:59
that... Will this one make the air?
00:37:01
Let's see, yeah. You know it, there's this sort
00:37:03
of brewing, Uber versus Waymo, Uber working with Waymo, right?
00:37:07
Like there's the classic good old-fashioned distribution versus
00:37:11
product technology, you know, showdown happening in all these
00:37:14
different areas, right? So Uber obviously has a massive
00:37:17
amount of distribution. Product versus distribution, is
00:37:19
that what we're calling it? Product versus distribution.
00:37:21
Exactly. Yeah, right.
00:37:22
Product or, you know, technology versus distribution, whatever,
00:37:25
right. So like is it better to be Uber
00:37:27
and have spent 15 years getting a, you know, billion people
00:37:30
installing your app and knowing it's how you call a car?
00:37:33
Or is it better to be Waymo and have all the self driving
00:37:36
technology? Or is there some grand, you
00:37:38
know, Pax Romana in the future in which they work
00:37:42
together and they both succeed, right?
00:37:44
I don't really know the answer. But I think the question.
00:37:46
Yeah, yeah, James, just give me the straight
00:37:48
answer here. My current take is, at least in
00:37:51
San Francisco, you're seeing the Waymo success, right?
00:37:55
OK. Yeah, it's surpassed Lyft, I
00:37:57
think, and almost caught up to Uber. That's in the number of
00:38:00
rides. It's a better experience in many
00:38:03
ways. I think people don't even know
00:38:05
how to explain why it's better, it's just that they like being
00:38:09
alone, maybe without a driver. I mean, I think Uber needs this
00:38:13
technology to get commoditized as quickly as possible.
00:38:16
Like, there are self-driving cars in China.
00:38:18
They're partnering with Wayve. Yeah, they need
00:38:21
enough self-driving car companies that their
00:38:24
distribution matters, because honestly, their distribution
00:38:26
will last unless there's this disruptive product that's
00:38:30
superior that people prefer.
00:38:34
self driving car companies, then I think their distribution is
00:38:37
strong. But if they're competing in
00:38:39
these markets where there's only one self driving car company,
00:38:42
then I think it's hard. Well, maybe it'll be that
00:38:45
there's different companies that succeed in different regions,
00:38:47
right? And then you're like, I'm
00:38:49
traveling to New York. I don't want to have to like,
00:38:50
learn. Exactly.
00:38:51
I think Uber has that case, that they're international.
00:38:54
Like do you want a different app for every city?
00:38:57
But I mean, that's that's kind of a big deal if Uber ends up
00:38:59
losing here, right, Which I know we're a long way away from.
00:39:01
But like, because so much of what the sort of received wisdom
00:39:05
of the startup world was: hey, you know, distribution
00:39:08
matters more than products. You know, network effects are
00:39:10
extremely strong and take a long time to break.
00:39:13
Brand matters a lot. Habit formation matters a lot.
00:39:16
It's almost exactly what you're hearing Sam Altman say about
00:39:19
ChatGPT being more valuable than the underlying models because he
00:39:22
says, well, we have 500 million weekly users of ChatGPT, and
00:39:24
that's ultimately more important than who has the best model in a
00:39:27
given week. It's like literally exactly the
00:39:29
pitch I would make if I were Uber, right about why Uber is a
00:39:32
great business. It's because everybody already
00:39:33
knows Uber. They already know how to use it.
00:39:35
And like, ultimately all the underlying technology will
00:39:37
probably, you know, get commoditized and we'll kind of
00:39:39
hang in there. So like, maybe that is what will
00:39:42
happen and the technology will get commoditized and it will be
00:39:44
fine. But if not, I think a lot of
00:39:46
startup conventional wisdom, kind of like has been wrong over
00:39:50
the last 15 years or 20 years. I don't know.
00:39:52
I think this is a case where this would be considered sort of
00:39:55
separately from ChatGPT. Like this is more of a
00:39:57
sustaining innovation, right? Where it's like a better product
00:40:01
that maybe even is more expensive, not less expensive,
00:40:04
right? And it does the exact same thing
00:40:08
in a better way. So the only question is why does
00:40:11
the network effect not matter? But the network effect is mostly
00:40:14
about drivers. So the network effect doesn't
00:40:16
matter because you don't need drivers.
00:40:17
So I think it's a very straightforward.
00:40:20
Success case for Waymo if that plays out that way.
00:40:23
Sure, but is that a failure case for Uber or not?
00:40:25
I guess, like... Well, yeah, I think that
00:40:27
that's the failure case, right? Is they invest in the
00:40:30
technology, and they just
00:40:31
got beat on the exact product that they offer, right?
00:40:34
Yeah, Uber clearly needs non-Waymos to have
00:40:37
competitive self driving. I feel like that's
00:40:40
unquestionable. And I think the reality is China
00:40:44
is a strong proof point that that exists.
00:40:47
And so to me that's a signal that Waymo will face serious
00:40:51
competition, which I think will be good for Uber.
00:40:55
And I think it's gonna be hard for companies like... Like,
00:40:57
Tesla gets more credit for self-driving than Uber, and they're
00:41:01
gonna have a hard time building this network.
00:41:03
And so, if I'm like, who's more overhyped on self-driving,
00:41:08
Uber or Tesla? I definitely think.
00:41:10
It's Tesla. I guess when I actually think of
00:41:13
how it will play out and not like what's happening right now
00:41:16
in San Francisco, I think I have more faith in Uber because I do
00:41:21
believe that the technology will get commoditized.
00:41:23
Like I think, and it'll just take a while.
00:41:25
It'll take, you know, five years or something.
00:41:27
But I think there'll be like an aftermarket for this technology.
00:41:31
So then anyone can, you know, have it in their car and
00:41:36
suddenly anyone can add their car to the Uber fleet or
00:41:38
something, right? And it's just, that's just not,
00:41:40
it's gonna be how Waymo operates.
00:41:43
Even Tesla right there, they're more focused on selling their
00:41:45
brand of cars, right? So yeah, I kind of think once
00:41:48
this commoditizes, all of a sudden it's like, then it's back
00:41:51
to what we were talking about. But it'll take a while.
00:41:54
So in the meantime, you're gonna see like certain cities that get
00:41:58
density of Tesla's or Waymo's, like just completely destroying
00:42:01
Uber in those regions. All right, I have a last theme:
00:42:06
sprinting until the end of history.
00:42:08
I just feel like, you know, if you look at coding, you
00:42:14
look at that, and now we talk about self-driving.
00:42:16
There's so many of these things where it's like you can sort of
00:42:18
game out how the world could get revolutionized and it's almost
00:42:23
like demotivating. It's like, I mean, you know, you
00:42:25
have to build, the world keeps changing.
00:42:27
You have to build really fast with the idea that if you are in
00:42:30
the best position when this big inflection point comes, you'll,
00:42:36
you know, have distribution or you'll, you'll be the power
00:42:38
player and you'll be able to lock it in in perpetuity.
00:42:41
But in the moment, it's sort of the opposite.
00:42:43
It's like the technology is ubiquitous.
00:42:45
Everybody is very competitive with each other.
00:42:48
And so the only way that you stay ahead is that you keep
00:42:51
improving the product. And so it's this sprint toward
00:42:54
this desired, like, stability point.
00:42:58
Do you want your question next? It's not a question, it's a
00:43:01
theme. It's like, the question
00:43:04
to you is, for this to be falsifiable...
00:43:07
I guess I think we are in a moment now where there's intense
00:43:12
pressure to self disrupt to embrace the current technology,
00:43:16
right. Like Figma obviously is a great
00:43:18
example where it's like a pretty recent startup that feels a lot
00:43:22
of pressure to match the current generation of startups,
00:43:27
basically. And Notion and Airtable and, like, every generation
00:43:30
of still-hip, pretty recent startups feels like they need to
00:43:34
meet the current product inflection point, in a way that
00:43:38
hasn't always been the case.
00:43:40
just did all the sprinting and now we're a big, pretty stable
00:43:43
company about to go public. Like, obviously you have to
00:43:46
continue to grow, but Airbnb wasn't totally asked to like
00:43:49
reinvent the wheel, you know, into the IPO.
00:43:53
But I feel like it's this sprint at the moment.
00:43:55
It kind of gets to James's earlier point about not to get
00:43:58
like too nerdy about this, but the disruption versus sustaining
00:44:01
innovations, right? The idea?
00:44:02
We actually don't know the difference.
00:44:04
So this is like, yeah, this is the classic
00:44:05
Clayton Christensen Innovator's Dilemma book, which
00:44:09
was very popular like 10-15 years ago.
00:44:10
And now it seems to have gone out of fashion.
00:44:12
I feel like no one really reads it anymore, but we read it, and
00:44:16
it's about how there's two types of, you know, new
00:44:20
businesses or new innovations in a market, right?
00:44:23
One is sustaining, which is it makes the product better and
00:44:26
it's quickly adopted by the existing companies.
00:44:29
So kind of what you're describing, we're like Figma
00:44:31
sees AI and they're like, oh, we got to plug in these sort of
00:44:34
lovable style creation tools into Figma because it's going to
00:44:38
make our existing customer base more powerful.
00:44:41
And maybe we'll grab an adjacent customer base, which is this
00:44:44
vibe coding group that wants to, you know, just design things and
00:44:46
then put them into production. And so it's sort of like, I
00:44:48
think electric cars is a classic example where a lot of companies
00:44:51
are adopting electric because it is an innovation in a drivetrain
00:44:54
for cars. But ultimately you still have to
00:44:56
make cars and it doesn't change the way cars work or who you
00:45:01
sell them to. And the idea of disruption is
00:45:04
you might have an innovation that makes a product in an existing
00:45:07
category worse, but that appeals to a completely different
00:45:11
audience, right. And I think frequently debated,
00:45:14
but one of the most popular disruption narratives of our
00:45:16
lifetime is, is the iPhone, which is it disrupted the PC
00:45:20
industry because what people use for computing before the iPhone
00:45:24
was, you know, computers that sat on their desk or desktops
00:45:26
and they did it a lot of serious work with it.
00:45:28
And the iPhone brought computing from something where maybe 500
00:45:31
million people had a computer to something that 5 billion people,
00:45:35
or nearly every person on the planet, has.
00:45:37
And they do all these things that we would have called
00:45:39
computing on their iPhones, like editing photos and browsing the
00:45:43
web and, you know, sending e-mail and calling Ubers and so
00:45:46
on and so forth. And so that was like a
00:45:48
disruptive market opening up where there was 4 1/2 billion
00:45:52
people who didn't have a computer.
00:45:53
But then this new form factor and this new price point and
00:45:56
this new business model kind of opened up this big new market,
00:45:59
right? And I think that what you're
00:46:01
asking is like, hey, is this a sustaining innovation or
00:46:03
disruptive innovation? A lot of this AI stuff, like, is this
00:46:05
going to create brand new audiences, brand new markets?
00:46:08
Or is this just going to make some of these existing products
00:46:10
better? I don't know.
00:46:11
James, does that summarize accurately?
00:46:13
To Eric's question, I feel, from the founder perspective,
00:46:16
like a lot of founders are really energized by this,
00:46:18
right? You see, like people like Dylan
00:46:22
leaning in and wanting to ship new features because of just how
00:46:26
much better they can make Figma and how much more effective.
00:46:29
I agree with you. There's like probably a
00:46:32
capitalist imperative here or something.
00:46:35
But I also think, sure, yeah, I do see a lot of founders just
00:46:39
like excited about what's possible.
00:46:41
I think there's an additional element.
00:46:42
Here that companies that once felt like they were in different
00:46:45
categories feel like they're colliding, right.
00:46:48
We talked earlier you're a word processor now you're competing
00:46:51
with like the search disruptor. You know, it, it just feels like
00:46:54
every company could potentially compete with each other.
00:46:59
And so I think that's creating this sense that everyone
00:47:02
wants to be a really buzzy startup worth endless amounts of
00:47:05
money. And you feel all the pressure on
00:47:08
all sides because you could get competed with from all these
00:47:11
different vectors. Yeah, there's like new fronts
00:47:14
in every direction that you have to fight: new startups, the
00:47:17
incumbents, the growth companies, you know, they're
00:47:21
all battling each other in new ways.
00:47:23
It's a little bit of like a vibes-
00:47:25
based assessment. But I think a lot of the time
00:47:26
when you see a disruptive innovation, it's disdained by
00:47:31
the existing players, disdained by the existing companies, they
00:47:35
consider it beneath them, or sort of fundamentally
00:47:39
condescend towards it. And I think that the like
00:47:42
Apple's attitude towards AI gives off a lot of disruptive
00:47:45
vibes where they sort of disdainfully talk about chat
00:47:48
bots and how nobody wants chat bots, you know, And Max, you
00:47:52
love Apple. You have the deep
00:47:54
love for Apple. But you're like, I love Apple, but I call
00:47:58
it like I see it. They were great.
00:48:00
Now, I think you've got some questions there, but like,
00:48:01
yeah, I think that as Apple's attitude towards AI goes, so
00:48:04
goes the nation. You know, I think, I think
00:48:07
Apple's attitude towards AI is disdainful right now.
00:48:10
That might change. They might have a leadership
00:48:12
transition in which they figure it out a little bit more.
00:48:14
But yeah, I think that's sort of an interesting way to
00:48:17
think about it. And it seems like today in the
00:48:20
industry, there's just so much energy.
00:48:22
Whereas maybe if you look at, I don't know, the legal profession
00:48:25
would be an interesting disruption candidate.
00:48:27
Right. Like, do the 70-year-old partners
00:48:30
at Wachtell Lipton take AI seriously, or are they disdainful
00:48:34
towards AI? Like my guess would be more on
00:48:36
the disdainful side. It's like, writers are disdainful.
00:48:38
Yeah, yeah. I
00:48:39
think a lot of writers are. Hollywood, yeah, exactly, right?
00:48:42
Yeah, yeah. So I love the mental models.
00:48:44
Just anyone who reacts too negatively to it, you're fucked.
00:48:47
And it's like a signal, right? Like,
00:48:51
a signal to check. It's not...
00:48:54
I know. It's just like a warning sign.
00:48:56
Yeah, yeah.
00:48:57
It's when some people think something interesting is
00:48:59
happening, but then the existing incumbents think that it's
00:49:02
beneath them or unworthy of consideration.
00:49:05
Right. And I think, yeah, legal,
00:49:06
Hollywood, some of the writing stuff we've talked about,
00:49:08
creative writing, is an area where, you know, you probably
00:49:11
see real disruption. Great.
00:49:13
All right, everybody, let's just
00:49:14
review the themes so everybody can remember what we
00:49:18
covered here. James, you want to start us off
00:49:20
and review your themes? Sure.
00:49:22
I spoke about designers moving down the stack
00:49:26
toward engineering, both in terms of prototyping and
00:49:29
launching new internal tools. And I spoke about everyone
00:49:34
wanting to own context. The war for context.
00:49:37
It's just getting started. We're going to see lots of ways
00:49:41
for people to block context from going to other companies, and
00:49:45
lots of new ways to share context.
00:49:47
But what's the distinction you're drawing? Not to
00:49:49
rehash the whole thing, but yeah, context versus data, like
00:49:51
what's the hard line you draw there?
00:49:53
I just call it context because I think
00:49:55
agents just really need context, right?
00:49:57
Like, they want personal context. Your Cursor
00:50:00
agents want personal context about your code base and your
00:50:04
company style guides and who the engineers are and how you assign
00:50:09
tasks and, you know, how you do documentation, right?
00:50:12
That's context. So there's a lot of context.
00:50:15
There's data in the world, right?
00:50:17
That's what we scraped to create these things like ChatGPT.
00:50:20
But then there's context that's like, this particular user of an
00:50:23
agent or this particular type of task needs more granular
00:50:28
context, which ends up going into the context window.
00:50:32
I mean, that's why they call it that, right?
00:50:33
It's like, what's the most important data to look at
00:50:36
while you're using an agent, right?
00:50:38
So, I don't know, war for
00:50:41
All right, Max, you want to review
00:50:42
your themes? Yeah, so I had, I mean, along the lines of
00:50:46
James's design theme, just discussing vibe coding.
00:50:48
How real, how big, how much of the future is vibe coded?
00:50:51
I had the super sexy Microsoft Office suite battle for the AI
00:50:55
era. That's great, yeah.
00:50:57
Yeah, yeah, open versus closed, all that good stuff.
00:51:00
And then finally, I think just the old distribution versus
00:51:02
product discussion, talking about Uber, Waymo, Wayve, other
00:51:05
folks like that. And then I think, yeah, we got
00:51:08
into some disruption versus sustaining narratives, which I think you
00:51:11
hung on too. Yeah, mine: I had
00:51:14
text box versus product, this idea of whether everybody should be
00:51:18
copying ChatGPT or whether that's just sort of an entry point into a
00:51:21
new technology that will return to well-designed tools.
00:51:26
I remember one comment on that, by the way: I
00:51:29
interviewed May Habib of Writer at our conference like 18 months
00:51:32
ago, and she was, like, shitting on the chatification of
00:51:35
everything, as she called it. And like, chat boxes have
00:51:38
probably grown by a factor of, like...
00:51:40
Exactly. That was just the beginning.
00:51:42
Since that point, theme two was the surveillance state.
00:51:45
Whether that's apps transcribing us or self driving cars
00:51:50
monitoring us, just the desire for data obviously requires a
00:51:55
level of surveillance that's new.
00:51:58
And then my sort of overly colorful third theme was
00:52:02
sprinting until the end of history.
00:52:04
This sense that companies face competition from every
00:52:08
direction. They have to keep moving.
00:52:10
And yet there's sort of this hoped-for endpoint where the
00:52:14
technology is so transformative that whoever is in the strongest
00:52:17
position is going to have a sort of insurmountable advantage.
00:52:22
Sounds good. That could be Francis Fukuyama's
00:52:25
next book. I love it.
00:52:27
Yeah, exactly. I know.
00:52:28
The end of history.
00:52:29
It's over. Great, yeah.
00:52:31
This is fun. I'm
00:52:32
super excited for the conference on June 25th,
00:52:34
we'll have videos out soon thereafter, your podcast feed
00:52:39
will get hit with some of our favorite ones and I'm sure we'll
00:52:43
ask far more wide-ranging questions.
00:52:46
I mean, we're always also interested in just how these
00:52:48
people's companies are doing, and we have lots of infrastructure
00:52:51
people and people dealing with energy consumption.
00:52:53
And so there are certainly a couple panels that, I can tell,
00:52:58
are on people's minds, but a lot to cover at the conference in
00:53:01
London. I'll see you guys in Europe.
00:53:03
See you soon. Excited to see you there.
00:53:05
All right, bye. Thanks guys.
