'I Fell in Love With an Algorithm' (w/Cristóbal Valenzuela & Alexis Gay)
Newcomer Pod · January 17, 2023 · 00:56:13 · 38.61 MB

Generative artificial intelligence is sweeping the nation. People are turning themselves into animated characters, drafting their essays with ChatGPT, and illustrating with Stable Diffusion. Or, as was the case with the tiny special effects team on the movie Everything Everywhere All at Once, they’re using it to help edit a movie.

On the latest episode of Dead Cat, Cristóbal Valenzuela, the chief executive officer at generative artificial intelligence company Runway, talked about how he discovered that his AI-powered video editing software was used to help make the award-winning film.

When I wrote about generative AI burning white hot back in October, I talked to Valenzuela for that story and called him “among the most compelling founders that I’ve come across while reporting on artificial intelligence.” So I thought it would be fun to have him on the podcast and discuss some of the most pressing issues facing generative artificial intelligence.

To help me interview Valenzuela, I invited Non-Technical podcast host and viral comedian Alexis Gay to guest host the episode. You’ll probably recognize her from some of her viral tech parody videos. (Listen to my guest appearance on Non-Technical if you want to learn more than you ever thought you wanted to know about the man behind the newsletter.)

On the podcast, Valenzuela predicts that “very soon,” in “a couple of years,” artificial intelligence software will be able to create the sort of TikTok videos that people flip through online. “We’re heading towards a world — where a lot of the content that you consume online will be generated [by artificial intelligence],” Valenzuela said.

“There’s definitely an exponential progress rate that you can see and perceive more clearly now,” Valenzuela said. “What took years of progress is now taking months. And what used to take months is now taking weeks.”

Give it a listen

Read the automated transcript



Get full access to Newcomer at www.newcomer.co/subscribe

00:00:05
Hey, it's Eric Newcomer. Welcome

00:00:15
to Dead Cat. Another great episode.

00:00:17
I'm excited to invite two people onto the show.

00:00:21
We have Alexis Gay, who is the host of the podcast

00:00:25
Non-Technical. And I went on her podcast and

00:00:28
we've stayed in touch, and I thought maybe she would enjoy

00:00:31
coming on to play the co-host role on a very sort

00:00:35
of technical podcast. So, hey, Alexis. Eric, thank you

00:00:39
so much for having me. I am so ready.

00:00:41
I'm ready to be technical. I know,

00:00:44
you've got metaphors galore, you've been prepping, so I'm

00:00:48
excited. Absolutely.

00:00:49
Alright, and then the one who gets to, you know, be subject to

00:00:54
the actual questioners here: we've got Cristóbal

00:00:57
Valenzuela, the CEO of Runway, who I actually encountered in one of

00:01:01
my early crazy generative AI hype cycles. And I thought,

00:01:05
normally, you know, I DM founders like, I'm writing

00:01:08
about your company, and, you know, they ghost me, but you were

00:01:10
like, oh yeah, let's get on the phone and talk about it.

00:01:12
We had a great call, so then I thought you'd be a fun

00:01:16
guest to sort of explain what exactly is going on in

00:01:20
generative AI. So, welcome to the podcast.

00:01:22
First, thank you for having me. And I'll do my best to try to

00:01:25
explain what's going on. It's crazy, right?

00:01:28
So, just to start off,

00:01:29
I mean, can you just give us some

00:01:31
backstory of how you got into generative AI and the artificial

00:01:36
intelligence world? Like, how did you come to this

00:01:38
space in the first place? Yeah, for sure.

00:01:41
So, it all started back in like 2015, 2016. I fell in love with

00:01:47
this algorithm and technique called DeepDream, which I

00:01:51
would consider like one of the earliest

00:01:52
AI art kind of models. It was very

00:01:56
niche; it was just a handful of

00:01:58
artists and researchers working with it.

00:02:01
And I fell so much in love with that algorithm that I

00:02:04
just left my job in Chile, got a scholarship, went to NYU,

00:02:07
and spent two years researching kind of, like, computational

00:02:11
creativity, applying AI models for creative

00:02:15
purposes. And, well, that's a rabbit hole.

00:02:18
That's hard to get out of. How does one fall in love with

00:02:21
an algorithm? I was going to say, absolutely not an

00:02:24
experience I've ever had. No, can't say the same. That's a

00:02:29
great question. I think for me it was the aesthetic

00:02:31
quality and the kind of possibilities that I was

00:02:34
seeing in this kind of early algorithm.

00:02:36
This is, again, before kind of the wave of outputs that you might

00:02:39
have seen today.

00:02:42
I would say two things. One, I fell in love with the idea

00:02:45
of using neural network techniques to generate images, to

00:02:49
generate video, right? To explore the kind of

00:02:51
aesthetic and creative possibilities

00:02:54
that weren't even imaginable before. Like

00:02:58
that idea of exploring uncharted,

00:03:00
unimaginable territories. And I saw these

00:03:04
early researchers and artists as a

00:03:08
first kind of step towards just experimenting

00:03:12
with it and seeing how far we could go.

00:03:14
And I think I fell in love with more than that

00:03:15
one single algorithm; it was the idea of experimenting with

00:03:18
technology, right? Of trying to use it in ways that perhaps

00:03:22
it wasn't meant to be used. That's the

00:03:24
rabbit hole I feel I fell in love with. But

00:03:27
long story short, one thing led to the other, and Runway was,

00:03:30
with my co-founders, kind of a research project;

00:03:33
it was my thesis at the art school that I went to.

00:03:36
And the whole idea is basically still the same

00:03:40
one that we have today, which is: how do you take these algorithms, these

00:03:43
generative AI systems, and use them in the context of

00:03:47
art-making, in the context of filmmaking, in the context of

00:03:50
design, just in the context of creative

00:03:53
processes? And I think what we've seen so

00:03:56
far, like last year, 2022, was a year where

00:04:01
models got really good, right?

00:04:03
And for me, the first wave was perhaps 2015 to 2020,

00:04:07
where, like, some techniques and algorithms, some researchers and

00:04:10
artists were playing with AI, and it was mostly seen as a niche thing,

00:04:13
right? I remember pitching,

00:04:15
like, telling people, hey, here's this thing

00:04:17
you can use to generate an image, and the first reaction I got

00:04:20
was like, Cris, this is a toy, right? This is you being an art student

00:04:24
doing art school stuff. This is just... I will never

00:04:27
use this, right? And I think that kind of

00:04:29
dismissing of it happens often when it's

00:04:32
very early. And I think the wave that we see now, which I would

00:04:35
say is 2022 onwards, and we just started on this second wave, is that

00:04:39
models have gotten really good, right?

00:04:41
So now my mom, like, asks me about it and can sort of

00:04:45
understand what I'm doing, right?

00:04:46
The outputs of them are just so much crisper and

00:04:50
more believable that I think it changed the

00:04:51
understanding of what we can do with it.

00:04:53
You said you went to art school?

00:04:55
I did. Yeah.

00:04:56
So did you spend a portion of your life considering yourself

00:04:59
an artist? Do you currently? I tried, I tried. I did. I

00:05:04
started in, like, econ and then, like, design, and also dabbled

00:05:08
in software engineering, and then spent time just in art

00:05:12
and research. And I quickly realized that I

00:05:15
guess I wasn't really a good artist per se.

00:05:18
No, I think with my art, I always

00:05:22
found excuses to make tools to make art.

00:05:25
And then I realized that those tools were far more interesting

00:05:28
and useful than the outcomes themselves.

00:05:30
Right. At least for me. I think I've

00:05:32
come to the realization that perhaps my art

00:05:35
is just tool-making, really; making something that then

00:05:39
a very talented artist can take and use in ways that I can't

00:05:42
even, like, imagine, right? And I think that's a

00:05:45
very kind of interesting position to be in. And in terms of

00:05:48
Runway: is it specifically, like, a video tool? Is that right?

00:05:52
Or am I thinking about it in a too limited way?

00:05:54
I mean, it's helping introduce sort of these generative AI

00:05:58
tools into video-making. Is that right?

00:06:00
Yeah, definitely. I mean, Runway, you can think

00:06:02
about us as a company... first of all, we are an applied

00:06:06
AI research company. So we develop and build kind

00:06:09
of the low-level, basic, fundamental algorithms for

00:06:13
content generation. And I mean content because we work with

00:06:16
images, with video, with just multimedia formats, right?

00:06:19
So we have a research team that sits at the core of the company

00:06:22
and of the product, that researches, innovates, and develops

00:06:27
novel techniques for kind of driving the

00:06:29
generation of content. And then we build products that

00:06:32
leverage those sorts of algorithms and techniques.

00:06:35
And there's things you can do on the video side.

00:06:37
So we have film and post-production companies, and

00:06:41
broadcasting companies, and studios, and just teams in

00:06:43
general, working on the video side of things, leveraging and

00:06:46
using our kind of algorithms and products. And we also have a set

00:06:50
of image-based solutions and products as well.

00:06:53
Do you have a DALL-E or, like, a Stable

00:06:56
Diffusion? Do you have sort of an image creator

00:06:58
that's public? Yeah.

00:06:59
Well, I mean, Stable Diffusion, we...

00:07:01
I know you're part of that. I'm going to talk about it.

00:07:03
But does Runway, the company, have a

00:07:05
separate sort of image product? Yeah, we have our own, like, 30

00:07:10
different tools, right? And every tool uses a different

00:07:14
algorithm, or a combination of different algorithms,

00:07:16
to either generate a video, edit a video, generate an image, or

00:07:20
translate from an image to a video, right?

00:07:22
And so what's interesting for us is that, for our

00:07:25
customers, you do get access to all these models

00:07:27
behind the scenes, but you are not necessarily kind of

00:07:30
aware of those models, right? So, from a filmmaker's

00:07:34
perspective, if you're an art director or a designer, if

00:07:36
you're editing a video, you don't really care about or

00:07:40
have to know about, like, the complicated ins and outs

00:07:43
of these models and the techniques and the names and

00:07:45
all this, in the same way that if you use Photoshop, you don't

00:07:48
really care about the blurring function or the algorithm that

00:07:50
allows you to blur an image. You just use it, right? You

00:07:53
just use the tool. So for us, it's that paradigm:

00:07:56
it's about human intention.

00:07:57
It's about you coming to the tool.

00:08:00
You work with these algorithms in very intuitive ways, to the

00:08:04
point where you forget

00:08:04
you're working with this kind of, quote, AI. Instead, you

00:08:07
work with a tool that helps you achieve something

00:08:09
particular. But behind the scenes,

00:08:11
yes, we have those models that basically just power the

00:08:14
whole set of features and products that we have. My

00:08:16
understanding, correct me if I'm wrong, is that

00:08:19
sort of generative AI really blew up in the public

00:08:22
consciousness when Stable Diffusion, an open

00:08:25
source project with ties to Stability AI, a company, came out.

00:08:30
You and your co-founder... I believe that Runway was one of

00:08:33
the head researchers on that project, and then that project

00:08:37
became open to the public. Before that, you know, these

00:08:41
projects had all been sort of behind the scenes; OpenAI,

00:08:44
despite the name, hadn't released theirs. And then

00:08:48
Stable Diffusion comes out, and sort of the world goes crazy,

00:08:50
OpenAI puts out DALL-E. Am I getting that right?

00:08:54
And where do you sort of fit into that project?

00:08:57
And how much are you involved with it?

00:08:58
Yeah, I think actually you're right.

00:09:00
I think so. It might look like there's an

00:09:03
overnight success of a specific model now that, let's say, the

00:09:06
space has gotten, like, so much attention

00:09:07
the last couple of months, right? It might look like, oh, something

00:09:10
happened with a specific AI model.

00:09:11
But the truth is that the research has been brewing for,

00:09:15
like, years, right? We've been working on Runway for the

00:09:17
last four years, and it takes a lot of iterations of

00:09:20
learning and training, and learning a lot from, like, what

00:09:23
works, what doesn't work, what datasets you can use, what

00:09:26
models you can train,

00:09:27
what the possibilities of that are, right?

00:09:29
So the actual code and model and paper that powers the

00:09:34
fundamental architecture behind Stable Diffusion was released

00:09:37
actually, like, almost two years ago, right?

00:09:39
And that's open source. It's been open source

00:09:40
since we published it. That work is a collaboration of Runway and

00:09:47
LMU Munich, which is a research organization in

00:09:48
Germany. And we do this a lot; we

00:09:51
collaborate with other researchers. Patrick Esser,

00:09:54
who's our principal research scientist, led those

00:09:57
efforts, right? And we published that code and

00:09:59
open-sourced it for a conference. That was released in

00:10:02
late 2021. And since then... it was

00:10:04
actually called latent diffusion.

00:10:05
It was our latent diffusion work, right?

00:10:08
Diffusion is a technique, and latent is, like, a word to

00:10:12
describe the space of possibilities in, like, an

00:10:14
AI model. So we basically called it latent

00:10:17
diffusion. There's a technical term that's actually longer than

00:10:19
that: it's high-resolution image

00:10:21
synthesis with latent diffusion models.

00:10:26
Sure, "latent," very good, consumer-facing. So then, yeah, you

00:10:30
can see... and then we trained the model. We first trained

00:10:32
the first model, and then iterated on it,

00:10:35
took training on more datasets for longer. And then a

00:10:38
company called Stability kind of offered compute. They basically

00:10:41
came in with infrastructure and just said, hey guys, I saw your

00:10:44
open source project, I'll just throw in some compute and you

00:10:46
guys can train it on a larger dataset.

00:10:52
And that's basically how Stable Diffusion was born.

00:10:52
And then there was, you know... I love drama.

00:10:53
There was a little drama when you guys updated

00:10:57
that software, right? And there was a little back and

00:10:59
forth about whether you were allowed to or not. What was

00:11:02
that whole thing? Yeah, I think there

00:11:07
was some confusion there. To be honest, again,

00:11:09
we published that work almost two years ago and it's been

00:11:11
open source. We've been improving it, and

00:11:14
somebody donated compute. And I think they had an internal

00:11:16
confusion or conflict about, like, ownership, which they kind of

00:11:21
apologized about both publicly and privately.

00:11:23
And I think it just stirred an unnecessary conversation. For

00:11:26
us, it's like, we've been working on this for a lot of time,

00:11:29
for a lot of years. And we publish new papers, we

00:11:32
have and will continue to upgrade our latent diffusion work, and we

00:11:35
have already, like, new models that are going to, like, change

00:11:38
the landscape. If Stable Diffusion was, like, an

00:11:40
inflection point, like, this year is going to be, like...

00:11:44
probably... well, you guys can ask me about that one. But, like,

00:11:47
the drama, to be honest... I know people, like, like the

00:11:49
drama, but I think it was just a misunderstanding, a confusion.

00:11:51
And I'm glad that it was resolved, something like this.

00:11:54
Yeah, I'm glad it's figured out like that.

00:11:57
I know. I think it's just like, God,

00:12:00
the stakes here are just... And this is classic, you know; we've

00:12:03
experienced open source software projects before, but

00:12:06
it's who owns what, and the stakes feel bigger with AI

00:12:11
because it's being presented as this sort of, you know,

00:12:14
everybody's gonna lose their job, or, well, you know, it's going

00:12:17
to change the world. And so I guess my question is

00:12:20
just like, yeah, all these new... you know, when OpenAI

00:12:23
comes out with GPT-4, when you update your software,

00:12:26
new versions of, like, Stable Diffusion: how much of those

00:12:29
projects will be in the public domain, where any company can use

00:12:34
them, and how much are they going to be owned by particular people?

00:12:38
Because my understanding with Runway was that,

00:12:40
like, part of how you were trying to differentiate was sort of

00:12:42
like the tooling, and making sure you can apply the publicly

00:12:48
available artificial intelligence

00:12:50
technology to specific use cases. Or are you also trying to own

00:12:55
proprietary algorithms? I think every company might have

00:12:59
a different strategy and a different, like, focus.

00:13:01
I think for us, something that we've defined very early

00:13:03
on, that has taken us some time to, like, really nail down, has

00:13:07
been this idea of owning your stack, right?

00:13:10
So we don't build our products on third-party, like, models or

00:13:14
software or datasets; we try to build the whole thing

00:13:17
end to end. And the truth is that that's

00:13:20
hard, because it requires you to really nail or understand

00:13:24
how to build research, how to build infrastructure, how to do

00:13:26
products, right? The advantage of that, once you've

00:13:28
built that muscle and you've trained enough, is that you can change

00:13:31
any part of the stack, right? So if I want to change,

00:13:34
if you want to improve a model because a specific customer wants

00:13:37
a better version of whatever we have,

00:13:38
I don't have to ask someone else, or I don't have to wait for

00:13:41
someone else to do it. I know how to do it, and I can go

00:13:44
there and, like, do it, right? I can go to the lowest level

00:13:47
possible, how a model is trained,

00:13:49
how a model is optimized, how it gets

00:13:51
deployed, and then go back to the surface level and change that

00:13:54
for them, right?

00:13:55
I think that answers it. So those are proprietary algorithms that

00:13:58
you've built? Yeah.

00:14:00
Of course. With that, we have custom research, and we

00:14:04
open-source some research. So if you go to our research

00:14:05
website, you'll see that the latent

00:14:07
diffusion, the Stable Diffusion work, was something

00:14:09
we open-sourced. There's other work that we haven't

00:14:11
open-sourced yet and will perhaps open-source at some point.

00:14:14
So it's a combination of both.

00:14:16
I think our open source strategy serves, like, community

00:14:20
building. Like, the wave of creativity that emerged

00:14:23
after releasing Stable Diffusion was just, like, really, really

00:14:27
incredible to see, and that just helps drive the field forward,

00:14:30
which is exactly what I think happened.

00:14:32
There was, like, an inflection point. It's great for

00:14:35
hiring as well. You attract talent and you

00:14:37
attract researchers who want to work with you, and, like, people

00:14:39
read the papers and people read the authors, and you go back

00:14:41
and, kind of, we get a lot of, like, really great validation

00:14:45
from that. But where our, kind of,

00:14:47
approach is perhaps different from other companies is that we're

00:14:49
not an open source company. Like, our product is not

00:14:52
open-sourcing things, right? Our products help creatives

00:14:56
use these techniques in comprehensive ways in their

00:14:58
day-to-day lives. Cristóbal, I have some questions about

00:15:01
that. So one thing

00:15:04
I'm curious about: you just used the term "wave of creativity," said

00:15:07
after these tools came out, there was a wave of creativity.

00:15:10
I largely saw, on Twitter, people typing, like, "bird flying over a

00:15:17
truck that's on fire, and also Elon's driving the truck."

00:15:21
You know what I mean? So my question is, when you say

00:15:23
wave of creativity using these tools, like, can you help us

00:15:26
understand a little bit about what kinds of things were

00:15:28
happening using the tools?

00:15:30
Yeah, for sure. I think every new technology

00:15:34
enables us to think about our creativity in different ways,

00:15:37
right? But, like, the ability for us to

00:15:38
type something and generate an image is just... it's novel,

00:15:41
it's new, we've never seen it before. Like, some people were

00:15:44
experimenting with it 10 years ago, but now you just open a

00:15:47
browser and just type something, you get something out, right?

00:15:50
And I feel like a lot of the time this, like, novelty

00:15:54
component of it is just trying new things.

00:15:56
You just try to see what's possible. To your point,

00:15:59
Alexis, that's exactly what you were saying.

00:16:01
You want to see how far you can go, right?

00:16:04
And, like, that's like a curiosity component.

00:16:07
Like, everyone is trying to break the systems.

00:16:09
Like, okay,

00:16:10
what if I, like, really try something really hard, right?

00:16:12
And I think that's like when you realize you're dreaming and

00:16:15
you're like, oh my God, I'm dreaming right now.

00:16:17
What should I try? And I think that's, that

00:16:21
speaks to that experimentation, that curiosity

00:16:24
phase. You're like, okay,

00:16:26
let's just experiment with it, right?

00:16:28
You're like, this is new. This is different.

00:16:30
I want to understand it. You know what I mean?

00:16:32
But I think true creativity comes after that, right?

00:16:35
After, okay, that has settled down a little bit

00:16:39
and you start experimenting with it seriously, right?

00:16:41
You start to think about how you might incorporate it in,

00:16:44
like, a filmmaking process, right?

00:16:46
And if you talk to a filmmaker or an art director, it's not about

00:16:48
the craziest, like, prompt that you can generate

00:16:51
something with. It's a lot about, like, control.

00:16:53
Like, I want to generate a variation of an image.

00:16:55
I want to change the color or the light in a

00:16:58
corner over there, right? How do I do it at scale?

00:17:00
Can I do it, like, in iterations

00:17:02
per second? Like, those are different kinds of

00:17:05
components. And for me, something I hear a

00:17:08
lot about is this idea that AI, or, like, these systems, are

00:17:11
automating creativity. Right?

00:17:13
I'm so curious about that. But I'm a comedian, by the

00:17:16
way, for context. I'm a comedian and I have a

00:17:18
podcast. But prior to that, I worked in

00:17:20
tech for seven years, just, like, a classic career path, as you

00:17:24
know. But anyway, so I have a lot of

00:17:25
thoughts... I have a lot of thoughts and

00:17:28
questions about this idea of AI-supported creativity, or

00:17:32
AI-supported, dare I say, content, or AI-supported art, and all

00:17:37
kinds of implications of that, and so I would love to hear a

00:17:39
little bit more. How much, Alexis, have you

00:17:42
tried to write jokes with it? Okay, so here's what I

00:17:47
tried. I saw AI-generated

00:17:51
versions of my own tweets, that's what I saw, using some

00:17:55
software that someone sent me, which I'll have to look up the

00:17:57
name of, and it was terrifying, because they sounded like me.

00:18:02
Mine would be, like, weird fights, and, like, I'd be

00:18:05
scared to look at it. An AI-generated

00:18:09
version of an Eric tweet would just be a bunch of fake scoops.

00:18:12
I need to look. Yeah, that's so good.

00:18:15
Yeah, so I love this idea, though,

00:18:16
that it's a phase. You're saying right now, or perhaps when

00:18:19
it first came out, we were in the curiosity phase, and that

00:18:22
implies, you said, true creativity

00:18:23
comes after that. So what do you think that means?

00:18:26
I think, for me, creativity is a state of mind.

00:18:30
It's a way of looking at the world, right?

00:18:31
It's not a tool. It's not a process, right?

00:18:34
And I think what you can truly automate is the process of

00:18:37
executing a creative idea, right?

00:18:39
I think you can go back to, like, previous moments in

00:18:42
time where there was a major technology change and

00:18:46
understand the changes that that transformation brought to

00:18:49
creativity, right? So I always go back to, like, one

00:18:52
of the earliest examples of technology changing art and

00:18:55
creativity and how we look at the world, which was: before,

00:18:58
like, the 1700s, painting was, like, mostly the realm and the

00:19:02
space and the possibility

00:19:05
of, like, folks who had

00:19:09
the money, the resources and the time and the knowledge to create

00:19:11
these very complex pigments, right?

00:19:13
And creating a pigment was a very sophisticated thing.

00:19:16
It had, like, this history and tradition; you had to hire

00:19:18
these masters that were very expensive, and you had to, like,

00:19:21
mix these very obscure colors, right?

00:19:23
And then, storing the pigments... like, you had to use a pig

00:19:25
bladder and seal it with a string.

00:19:28
It was very, like, obscure, right?

00:19:28
And then in the 1800s someone, basically an inventor named John Rand,

00:19:32
came up with the collapsible paint tube.

00:19:34
Right. That's just a piece of

00:19:35
technology, right? You can take colors and put them

00:19:38
in, like, a paint tube. And the most important thing is

00:19:42
that you can take that paint tube out into the world, right?

00:19:45
And so Impressionism was born, right?

00:19:47
You had people going outside and having, like, small

00:19:50
canvases; you didn't require, like, a whole studio to just paint,

00:19:54
right? And so what that meant is that

00:19:56
you started looking at the world in a very different way. People

00:19:58
started painting not just, like, portraits of, like, the royalty,

00:20:01
but, like, mountains and trees and so on.

00:20:04
Society, like, advanced, right? And it opened this creativity to,

00:20:07
like, the merchant class. That's what I mean by that:

00:20:09
creativity, the opportunity. That's, like, a piece of

00:20:11
technology that enables you to look at the world and express

00:20:15
your view of the world in a very different way than anything

00:20:18
you've used before, right? And fast-forward, it's like that

00:20:21
enabled a huge revolution in art and, like, forever

00:20:24
transformed how we think about painting, art, etc., right?

00:20:26
And you can think about very similar moments in time where

00:20:29
technologies like that have changed, right, art and

00:20:32
creativity. And photography is another one, right? Photography

00:20:36
changed how we look at the world. And a lot

00:20:38
of the time, early on in the photography, kind of, like,

00:20:41
wave, people said that, like,

00:20:44
photography was kind of like the end of art, right?

00:20:46
It's gonna be the end of, like, paintings, right?

00:20:49
And the truth is, it's not.

00:20:51
It's just a different medium, right?

00:20:53
You just need to explore that medium in a new way.

00:20:55
There are new artists that will emerge from that.

00:20:57
New ideas will come after that, right?

00:20:59
And you open the door for something that's

00:21:03
very hard to predict, right?

00:21:04
Because you're trying to predict something

00:21:06
you haven't even experimented with yet, right?

00:21:09
And it's very hard to, like, make this set of assumptions over

00:21:11
time about everything that needs to happen.

00:21:13
I think I've come to embrace that, like, uncertainty, as in, like,

00:21:17
being early on in something that's going to massively transform a lot of

00:21:20
things, and we should embrace this, like, experimentation

00:21:23
phase, that curiosity phase, right?

00:21:25
And the kinds of things that we'll build after that are yet

00:21:28
still to be, like, uncovered and discovered, and that for me is

00:21:30
really, really exciting. I mean, I have such conflicting

00:21:34
thoughts on this. Like, on the one hand, so many, like,

00:21:37
amateur visual artists, to me, like, what they enjoy is, like,

00:21:42
the physical process over the output, right?

00:21:46
And even if they can get better outputs from, sort of, an AI

00:21:51
system, is that cutting into some of the enjoyment? I get it:

00:21:55
digital art and also photography and everything have cut into that,

00:21:58
and people get their choices. On the other hand,

00:22:01
I'm, like, so excited by the idea of, like, pairing the human mind

00:22:06
with, like, another sort of intelligence. I mean, there's

00:22:09
such... I feel like, you know, Alexis is

00:22:13
wearing a chess sweater. It's like, if you get good enough

00:22:15
at chess, you feel, like, a certain connection.

00:22:18
You know, you can use these bots to, like, figure out how you

00:22:21
should play better. I feel

00:22:23
like you can already experience this today, where you can sort of

00:22:26
feel in sync with, like, a robot system. Or, I play a lot of bridge,

00:22:30
and my partner in bridge online is a robot; I sort of

00:22:34
understand what they do. And so having sort of this

00:22:36
partnership with some sort of intelligence is, you know, a

00:22:41
very charming experience. But yeah, it's definitely a

00:22:46
terrifying future. Are you worried about it at all?

00:22:48
Like, do you ever worry? Or is it just inevitable to you?

00:22:53
Yeah, I'm not worried a lot. I think actually, it's a very

00:22:55
exciting time to be in a very optimistic about it.

00:22:57
I think like radical technologists.

00:22:59
Like, this are, like, one of the things I did do is I deleted the

00:23:03
cause And enable things to be more convenient, more easy to

00:23:06
use, right? So think about like filmmaking,

00:23:09
the people who have access to, like, professional film-

00:23:10
making tools — even nowadays it's, like, a handful of people that

00:23:13
know how to, like, operate professional software, right?

00:23:16
And they will do, like, complicated VFX workflows and need to

00:23:19
understand computer graphics and a bunch of things. Picture

00:23:22
having everyone with, like, ILM or, like, VFX professional

00:23:27
skills, right? It's a wave of, like,

00:23:29
storytelling in ways that we haven't even thought of before.

00:23:33
It's just, like, right around

00:23:34
the corner. What kind of tools do we need for that?

00:23:36
Well, let's just build them — I don't know yet.

00:23:38
I'm more excited. I feel like more excited about

00:23:40
the outcomes of those and enabling more people to express

00:23:43
themselves in creative ways. And I think that's ultimately

00:23:45
the role of technology, like, at large.

00:23:47
I think I'm less interested in, like, the idea that

00:23:51
you will automate it and that we'll come to, like, some sort of, like,

00:23:53
one-click solution and, like, everything will become the same.

00:23:56
I think that's just boring and, like, it doesn't really seem

00:23:59
interesting, to be honest. And it's also, like, a partial

00:24:02
view of how the world actually works.

00:24:03
I think. I think there's always this

00:24:05
human component in that; that's what gives us, like, meaning.

00:24:08
Alexis, you were saying, like, what if someone just generates tweets

00:24:10
that are, like — actually, there's — I've heard there's a

00:24:13
generated podcast, right? So you can generate the script

00:24:16
and you can generate the voice, and you have this fake

00:24:18
conversation between, like, Steve Jobs and, like, Lex Fridman,

00:24:21
right?

00:24:22
Everything is generated, right? And the interesting thing is, like,

00:24:25
yeah, I mean, works very well, you're convinced halfway through

00:24:27
this is actually happening, but I feel like there's still a lot

00:24:30
of potential of using that in way more creative ways than just

00:24:33
generating the entire thing, like, end to end, right?

00:24:35
I hear what you're saying, and I agree with a lot of it,

00:24:38
especially the point about the way access to pieces of

00:24:42
technology can really draw out someone's creativity.

00:24:45
I think my personal experience is a good example of that where

00:24:49
I was able to teach myself to edit video because iMovie came

00:24:53
on my Mac, right? And then I knew how to use

00:24:55
YouTube, and then I graduated to other software, but it was all

00:24:58
accessible — it was all just downloading things and

00:25:00
looking up how to do this; some

00:25:02
twenty-year-old in his basement told me online, and —

00:25:04
great — I know how to make videos now.

00:25:06
But here's what's different to me about generative AI

00:25:10
and some of the other tools that have preceded it in assisting

00:25:13
creativity, which is that generative AI models.

00:25:16
And please correct me if I'm using the wrong terminology,

00:25:18
they have to be trained on something, right?

00:25:21
And so when we're talking about visual art or God forbid comedy

00:25:26
or anything else, do we have to feed it, existing artists work

00:25:31
in order to teach it how to do it? And does that not raise

00:25:34
implications around ownership, credit, and profit participation

00:25:37
in the output of the AI? Yeah, totally.

00:25:39
I think those are very valid questions and things that are

00:25:42
really worth asking ourselves as we develop these

00:25:45
new technologies. And — again, I'll go back to,

00:25:46
like, previous moments in time — there's this video, this

00:25:49
recording from 35 years ago, of the same questions being asked when

00:25:53
Photoshop was first released, right?

00:25:55
It's the same conversation — I'll

00:25:57
show you guys, it's the same discussion, right?

00:25:58
Watching the original, I was like, this is an

00:26:01
interesting question. But it's — but it's a valid

00:26:05
question. It's part of the conversation.

00:26:06
And there's — yeah, it is, right?

00:26:08
And the interesting part is, like, what they were discussing at

00:26:11
the time. It's like, are you allowed to

00:26:13
edit a photograph, right? A photograph is, like, the truth,

00:26:17
right? You can't edit it, right?

00:26:18
You can't go into National Geographic.

00:26:20
or go into, like, a TV magazine and edit an image, right?

00:26:22
And half of the panel was like, this

00:26:25
technology should be banned, right?

00:26:26
And you can't imagine now, like, not having a tool like

00:26:29
Photoshop at our disposal. It's so obvious — and, like, Photoshop

00:26:32
became a verb. Right.

00:26:34
And so for sure there's a lot of things that you need to figure

00:26:36
out — like, Photoshop had to figure out a lot of things and kind of

00:26:39
address concerns about how to use the technology.

00:26:42
People were using it to, like, forge, like, fake bills — to,

00:26:45
like, create fake money, right? So if you try to open a photo

00:26:48
of a bill in Photoshop, Photoshop will, like, prevent you

00:26:50
from doing that, right? You just can't — it's

00:26:53
blocked. And so you build — you build over

00:26:56
time ways of, like, securing the system, right?

00:26:59
But the 99% of the outcomes that are emerging from this — they're

00:27:03
going to be net positive, they're going to make society better.

00:27:06
99%? I'm sorry.

00:27:08
I just want to go back. So you — I just want you on the

00:27:11
record — just said that 99% of the outcomes —

00:27:14
That's a — okay. I think that's just a ballpark,

00:27:20
but I think, yeah,

00:27:21
most of them, I would say. Don't put that on a slide somewhere.

00:27:25
Oh, of course. I mean, it's a, it's a very

00:27:28
powerful technology. It's going to be as massive as

00:27:30
cell phones were. Like, I feel like mobile was, like,

00:27:33
really net positive for everyone, right?

00:27:35
Like you're able to hold like a lot of knowledge in your pocket

00:27:39
and able to connect with anyone in the world, right?

00:27:41
I just think that's net positive.

00:27:43
I think AI and generative AI are similar technologies, right?

00:27:46
There are, of course, like, people that are going to try —

00:27:48
there are bad people in the world, and they try to use

00:27:50
everything they can to do bad things, for sure, and, like, we'll build

00:27:53
controls to prevent those. But overall, the outcomes of the

00:27:56
technology are net positive. Like, we're allowing the

00:27:59
democratization of content, for example — as you were saying,

00:28:02
anyone having access to these things — I just think that's a

00:28:05
magical change. Yeah, but I think, to your point,

00:28:07
there is there's a lot of conversation that needs to be

00:28:09
had — and people are having it — around, like, datasets and training the

00:28:13
models, and which data it is and, like, how you do it.

00:28:15
But again like Google had the same kind of like discussions

00:28:19
early on, right? 'I don't want my data to be

00:28:21
crawled by Google, right? Can I have,

00:28:25
like, a way of opting out?' If you don't want to

00:28:27
have your website be, like, searchable, the answer is

00:28:30
you can, like, just put this file on your website, and Google will not

00:28:34
scrape it and will not index you in

00:28:36
the search — in the search results, kind of, like, options.
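The opt-out file being described here is robots.txt, the long-standing Robots Exclusion Protocol. As a rough illustration — the rules and URL below are hypothetical examples — Python's standard urllib.robotparser shows how a compliant crawler reads such a file:

```python
from urllib import robotparser

# A hypothetical robots.txt served at a site root; these rules tell
# every crawler ("*") to stay out of the entire site ("/").
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

def is_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Check whether a compliant crawler may fetch a URL under the given rules."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

# A compliant crawler reading these rules must skip every page on the site.
print(is_allowed(ROBOTS_TXT, "Googlebot", "https://example.com/page"))  # False
```

Note this is exactly the trade Valenzuela describes: opting out is voluntary cooperation by the crawler, not a technical lock, which is why the analogous opt-out question for model training data is still open.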

00:28:39
And so there's I think there's still a lot of things yet to be

00:28:41
uncovered, and models are not 100%, like, ready yet to be, like,

00:28:46
used in all sorts of, like, professional environments. We're

00:28:49
still very, very early. How important do you think the

00:28:51
humans are to generative AI or will be in the next couple

00:28:55
years? What we're talking about is

00:28:58
humans sort of as creators, experimenting, using these tools.

00:29:02
But in another conversation on this podcast, we've talked about

00:29:05
TikTok and, you know, how sort of the content-sorting algorithm

00:29:09
is so good. And, you know, there was this

00:29:12
question asked of, like, how far away are we from TikTok

00:29:15
just saying, oh, you know, not only can I guess which videos

00:29:20
that exist people want, but I can just create the video

00:29:23
that you are likely to want? Like, how far away do you think

00:29:26
that world is, where somebody like TikTok can just deliver

00:29:29
the video that I want without a human involved at all?

00:29:33
Oh — very, very close. I think we're not far away from

00:29:36
having that. Good? I'm not, like, 'Oh, yay.'

00:29:42
You saying that makes me scared. That does not sound fun to you?

00:29:46
Sure — I don't want to commit to a specific date or moment, but I

00:29:50
think, like, we're heading towards a world, in general,

00:29:53
where a lot of the content that you consume online will be

00:29:55
generated. Picture, like, a YouTube-generated, like,

00:29:58
stream, right? You can do Netflix-generated

00:30:00
content, like, TikTok-generated content, right?

00:30:02
I think that's somehow feasible, like, today, if you combine a

00:30:06
bunch of different things, but there's — there's still a lot of

00:30:08
things to be developed to get to a point where, like, you'll do it in

00:30:10
real time. I think it's bound to happen, if

00:30:13
it works, if it's possible. And if it enhances content and allows

00:30:17
us to explore new avenues of, like, creativity,

00:30:19
I think that's — that's going to be great.

00:30:21
I think a lot has to do yet with developing these models in

00:30:24
safer manners and in more kind of, like, aligned ways to our human

00:30:28
intention. And there's a lot of work to be

00:30:30
done still yet there to prevent possible

00:30:32
misuses, for sure. Sure. But overall, I think we're

00:30:35
going to be at a time, a moment, where, like, you're going to be in

00:30:38
every single movie you ever wanted to be in, right? You are

00:30:40
going to be — you're going to be in them.

00:30:42
The nice thing about it is not only that it can create

00:30:45
something, but it can create something specific to me that I

00:30:49
will like. Yeah — and compelling.

00:30:51
Yeah, 100%, 100%. And — and when you think this is,

00:30:54
like, three years out, like — yeah,

00:30:56
what's your 'soon'? I'm like, how do I need to

00:31:00
prepare? My job security — like, what does my timeline need

00:31:03
to be to learn to code? I just need a sense for when I'm

00:31:06
going to be replaced by GPT-3.

00:31:09
I think very soon — a couple of years,

00:31:11
I would say. Yeah, that's good. There's

00:31:15
definitely, like, an exponential, like, progress rate that you

00:31:20
can see and perceive more clearly

00:31:21
now. Like, here's the thing: what took years of

00:31:25
progress is now taking, like, months, right?

00:31:28
Yeah. And what used to

00:31:30
take months is now starting to take, like, weeks, right?

00:31:32
And you go by that all the time — like,

00:31:35
large language models are now going to be writing, like, papers,

00:31:38
going to be writing code, going to be — and you start

00:31:40
accumulating the amount of progress that you

00:31:42
can make, right? And I think that's — it's

00:31:44
happening, and it will continue to happen, but I think that's

00:31:46
going to happen, like, sooner than we think.

00:31:47
I mean, I respect you for doing your best

00:31:50
not to give us a date, because, you know, I covered the

00:31:52
self-driving car industry. And that was an artificial-

00:31:56
intelligence-powered industry that always said, you know, full

00:32:00
autonomy is just over the horizon.

00:32:02
Like, what do we take from that experience?

00:32:04
Why is this different than that? I mean,

00:32:07
people have gotten excited about chatbots before. Things can

00:32:09
appear — you know, if you solve 90% of the problem, but you need

00:32:13
to solve the whole hundred percent, it can be easy to

00:32:16
convince yourself that you're, like, almost there,

00:32:18
but the last ten percent can be — can be extremely difficult to

00:32:21
impossible. I think it depends on what your

00:32:24
goal is and what you're trying to accomplish, right? So I guess

00:32:26
in the case of self-driving cars, it's, like, full autonomy, right? Can

00:32:29
you drive a car on any street in the world, like, with no

00:32:33
human at all, right? That's — that's the end goal, right?

00:32:36
For us, at Runway, it's not,

00:32:40
like, can you make a piece of art or can you make a video with no

00:32:42
human intention and no control at all?

00:32:43
Right? I don't think that — that's not the goal.

00:32:48
The goal is, like — the goal is, like, can you take an idea and

00:32:51
execute that idea in the fastest way possible, right?

00:32:51
And right now, if you want to create a video — like, Alexis, if you

00:32:54
wanted to make a video 20, 30 years ago, it would have taken you,

00:32:58
like, months, right? You've got to, like, rent equipment,

00:33:01
and then, like — and have someone —

00:33:03
Cost-prohibitive as well. Yeah.

00:33:05
It's too expensive, right? And so your idea is like, I want

00:33:07
to make comedy, I want to make this' —

00:33:09
I mean, usually it would take

00:33:11
you months, and you need to find a producer

00:33:13
and, like, all this. Yeah.

00:33:13
And the fact is, like, technology — like, smartphones

00:33:16
and the internet — got you, like, 90% there. There's

00:33:18
still, of course, work you have to do, like coordinating

00:33:21
and recording and editing, etc., but, like, it's removed, like, 90% of

00:33:25
the work, right? So it's so much

00:33:27
better, so much more feasible, to even consider doing something

00:33:30
like that. So if the goal for us is, like,

00:33:33
to help creatives, it's just, like, get over that,

00:33:36
like, 80, 90 percent of the work that you don't want to do as a

00:33:38
creative. Maybe, just, like — no one wants to,

00:33:40
like, spend time, for example, searching through

00:33:43
hundreds of videos and then copying those videos and then

00:33:45
placing them, to express the idea that you have in your

00:33:48
head. It would be great if you could

00:33:50
just, like, automate that, right?

00:33:52
And that's, like — that's 90% of the process, right?

00:33:54
And I think, if the goal for us is measured by that,

00:33:57
I think we're very close to getting to that point, right?

00:33:59
And that's the ultimate goal. I would say generative AI is, like,

00:34:02
taking the cost of content creation down to zero — not the cost of,

00:34:05
like, ideas, right? Ideas are still, like, ideas, and,

00:34:08
like, the best ideas will still win, right?

00:34:10
I think this is the time where everyone will become more of an

00:34:14
editor and curator, and, like, whoever has the best ideas will have them

00:34:17
executed — not because they have the funding or the resources,

00:34:20
just because they're the best ideas.

00:34:22
Well — we have to — we're playing sort of the cynics, even

00:34:24
though I'm extremely excited about this technology, but you're

00:34:27
doing a good job of making the bull case.

00:34:29
Do you worry that we're going to have even more of, like, a

00:34:32
bullshit problem? In the world of social

00:34:34
media, there's already been the problem.

00:34:36
You know, if you have a big account, you have people writing

00:34:38
to you, and just the work of, like, sorting through, among

00:34:41
humans, what's, like, a reasonable

00:34:43
critique to spend time on intellectually and engage with, and

00:34:46
what's sort of a waste of your time, is a taxing exercise,

00:34:49
even if you're, like, sort of a with-it person. And then if you

00:34:53
add to the mix — I literally, on Twitter,

00:34:55
had someone, I'm 90% sure, use GPT as, like, an auto-response to

00:35:00
a bunch of tweets to like, juice engagement.

00:35:03
I responded, like, 'GPT?' and they sent me a smiley face, and it was

00:35:07
obvious to me. But it's still, like, oh my God,

00:35:10
am I gonna run into that all the time, where I'm going to have

00:35:12
to engage with the cognitive task of 'is this GPT?'

00:35:15
And of course, the problem with it being GPT is, GPT, or

00:35:19
whatever text tool, they're sort of bullshitters.

00:35:22
They're very — they're 100% confident and maybe 90%

00:35:26
accurate, and that's very challenging for human beings.

00:35:29
Yes. What do you say to the

00:35:31
bullshitter problem of AI? That's a good question.

00:35:35
I think, overall, I'm optimistic that that

00:35:37
will get solved in the same way that, like, spam and, like,

00:35:40
scammers got solved in, like, mail and on the internet. It's like, if

00:35:45
you receive an email from, like, a prince of Nigeria,

00:35:47
you're like, yeah, this is — this is spam, right?

00:35:50
And this was, like, a very common, like, thing on the internet.

00:35:52
Like, you build filters, and you bake things into the

00:35:54
tech, and, like, people figure out guardrails —

00:35:58
like, there's ways of, like, improving those systems to, like,

00:36:01
avoid that. Right now, what's happening is

00:36:03
that it's still very early, right?

00:36:05
Like, if you consider, like, ChatGPT — ChatGPT was, like, a research

00:36:09
release, right? It wasn't, like, a product, right?

00:36:11
And, like, they were taken by surprise by how

00:36:13
popular it was, right? Yeah — at the bottom,

00:36:17
it says, like, this is a research release, right?

00:36:18
There's a lot of things that they haven't yet considered; they're just

00:36:22
trying to learn how people use it, the possible, like, things

00:36:25
that need to be fixed. And I think that a big

00:36:27
misconception is to look at this and be like, okay, this — this is

00:36:29
it, this is the final thing we'll ever use, and it doesn't work,

00:36:33
right? I'm like, no, it's just, like, a

00:36:34
point in progress and time, and it will continue to improve

00:36:38
until a point where, like, all that bullshit that, like, you

00:36:41
will find — those things will be, like, either reduced or

00:36:44
removed or, like, prevented. And there's still a lot to be done, for

00:36:48
sure. I think that there is cool

00:36:50
stuff. I agree with a lot of what you

00:36:52
said around taking away the burden of some of the more, like, task-y

00:36:58
pieces of the creative development process.

00:37:00
I think that there's a lot — a lot that could be good there.

00:37:03
I'm just scared — that's it, I'm scared, is all.

00:37:08
We've talked a little bit about, like, oh, eventually people will

00:37:11
figure this out, and I'm sure that, like, things will happen

00:37:14
down the line. I'm kind of like, this seems to

00:37:16
be going awfully fast, and I don't know

00:37:19
who is ultimately responsible. Like, who's the adult in the room

00:37:21
here that's supposed to put a hand up

00:37:23
and be like, we should probably have some rules?

00:37:26
So, to add some realism to it: I think so much of what happens

00:37:28
is just, if it's possible, it will happen. And that's how I —

00:37:31
yeah. I think a sort of assumption

00:37:34
underlying some of this conversation

00:37:35
is that, like, the learn-to-code crowd is going to win and the word-

00:37:40
cels — the writers among us — are the ones

00:37:42
who are going to be screwed. Do you actually agree with that

00:37:45
premise? I mean, we're seeing these, like,

00:37:48
programs do some coding. Or, like, is it clear to you, like,

00:37:52
which human skill sets come out on top,

00:37:55
if this all happens as you see it,

00:37:58
And which ones are the losers? Yeah, that's a good question.

00:38:01
I think I don't have a full answer.

00:38:03
yet, because I think a lot has yet to be, like, discovered

00:38:06
and, like, understood about how far models can actually go.

00:38:10
I think, as with every kind of, like, previous moment in time in the

00:38:13
past where something like this has happened,

00:38:15
like, there are jobs and disciplines and, like, things that

00:38:19
just disappear, right? Like, the

00:38:22
market for, like, people who could do analog film editing has, like,

00:38:26
massively been reduced, right? No one is cutting films,

00:38:29
like, with scissors anymore, right?

00:38:30
You can't find these people, right?

00:38:32
And the reason is that those folks had to adjust their jobs,

00:38:35
like, for the digital age, and they had to

00:38:38
learn new tools and understand the limits and the kind of, like,

00:38:41
new directions of using them, I think.

00:38:44
Overall, my hunch is that we'll start to see that, like,

00:38:46
professions and jobs will radically change, right? Software

00:38:50
engineering, for example — I think it will become more about, again, having

00:38:52
an idea and helping a system, like, execute the idea in a

00:38:55
secure way, right? So if you want to write code right

00:38:58
now, you can just, like, use something like Copilot and just write a

00:39:01
function and have, like, the Copilot, like, complete the whole

00:39:04
function for you, right? So you're not going into the

00:39:06
documentation to try to understand APIs and then going

00:39:09
back — it just, like, works really, really well, right?

00:39:11
But you still need to give guidance, and you need to — the

00:39:13
human needs to understand the intention, right?

00:39:15
And you're, like, having the system just help you along the

00:39:18
way on that. Right?

00:39:19
So I think that — I'm not sure, like — the software engineer's, like,

00:39:23
job itself is, like, not going away. It's still going to be here,

00:39:26
right? It's just radically going to be

00:39:27
different. It's going to sort of

00:39:28
function differently. The skills it might

00:39:31
require are going to be different. You might not need to know these

00:39:33
very obscure, like, levels of, like, API documentation, because

00:39:37
now you have something that can do it for you.
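The division of labor he's describing — the human supplies the intent, the system drafts the mechanics — looks roughly like this. A sketch only: the body below is hand-written to stand in for the kind of completion a Copilot-style assistant produces from a signature and docstring; it is not actual Copilot output, and `slugify` is a hypothetical example function.

```python
# The human writes the intent: a signature plus a docstring.
def slugify(title: str) -> str:
    """Turn an article title into a URL-friendly slug."""
    # From here down is the part a code assistant would typically draft.
    # (Hand-written here for illustration, not real model output.)
    keep = [c.lower() if c.isalnum() else " " for c in title]
    return "-".join("".join(keep).split())

print(slugify("I Fell in Love With an Algorithm!"))  # i-fell-in-love-with-an-algorithm
```

The guidance he mentions still matters: the signature and docstring are what steer the completion, much as a prompt steers a text model.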

00:39:39
And for writers, I think it's the same. It's pretty —

00:39:40
like, think about, like, how, like, computers

00:39:43
and typewriters changed the process of writing. You were

00:39:46
able to, like, erase things more easily, you were able to, like,

00:39:49
compose ideas more quickly, you were able to, like, remix things,

00:39:51
because you have them saved in different ways, right?

00:39:53
I think that just changes the nature of, like, writing and how

00:39:56
you can work. And, right, if I'm not an English,

00:39:59
like, native speaker, like, these algorithms can help me, like,

00:40:02
significantly improve my writing skills. And now I can kind of have a

00:40:05
conversation with someone else, in another language that I don't

00:40:07
have, and be exposed to things I wouldn't otherwise be, and that

00:40:09
can kick-start something. So I think it's more interesting

00:40:12
to have, like, a very open mind and a very kind of, like, learning

00:40:15
mentality of, like, okay: how is my job, or the things I

00:40:18
do, being augmented, and how do I need to

00:40:20
transform and change? And I think there's, like, that

00:40:24
hesitation — the, the worried component of it — is very

00:40:26
natural. Like, silent film was, like, the thing for 20 years,

00:40:31
like, in the early 1900s, right? And when audio came to be a

00:40:35
feasible technology to have in cinemas.

00:40:38
The first reaction from like everyone in the Hollywood

00:40:40
industry was, like, we need to ban audio, right?

00:40:42
This is going to destroy the — and this is true:

00:40:45
Charlie Chaplin — you do not want to be on that side.

00:40:48
Are we still recording? Charlie Chaplin was, like, banning

00:40:54
audio, like, 'I will show you.' And, like, the

00:40:56
association for, like, film and audio — like, the musicians from,

00:41:00
like, Hollywood — were banning audio from movies, right?

00:41:03
Because who's gonna pay for, like, the orchestras that are in the

00:41:05
theaters, right? And of course, like, something similar is

00:41:09
happening for the — for this group.

00:41:11
A couple of things you're saying stick out to me. One,

00:41:15
there's potential here that the proliferation of these tools

00:41:18
that — let's just focus on writing — like, help you write

00:41:21
could actually raise the bar for what type of writing we're

00:41:24
willing to, and interested in, consuming. Because if everybody

00:41:27
can, let's say in a year, quote, 'write,'

00:41:30
like, via GPT-3, the bar is going to be much higher for your

00:41:34
average, I think, article, or newsletter, or whatever, because

00:41:37
you're going to have to be bringing something really

00:41:39
special to the table in order for people to spend their time,

00:41:41
reading it. Yeah, 100%. Maybe that's just me

00:41:43
trying to be — I just want you to know, I've sort of been in

00:41:48
Silicon Valley long enough, and watched the government be slow,

00:41:51
that I'm — I'm like, whatever is

00:41:52
possible will happen, and sort of nothing will put the cat back in

00:41:56
the bag. So I don't worry about it too much —

00:41:59
you know, it's good to think through these things, but,

00:42:01
yeah, the technology is going to progress.

00:42:03
Yes. But I do think it's an open

00:42:05
question, you know, what this does to

00:42:08
literacy when, like, all of a sudden people are just, like,

00:42:10
outsourcing, you know, their professional writing.

00:42:14
You know, most writing is bad; most writing is formulaic.

00:42:17
Yeah. But people learning to think is

00:42:20
important, I mean, I feel like there's been a lot of chatter

00:42:23
about the sort of, I don't know, High School, cheating problem.

00:42:26
I literally see it on teacher subreddits, you know, worrying

00:42:29
about the cheating problem. I don't know.

00:42:32
Are you worried about it? Like, one solution

00:42:34
I was seeing is — I assume the AI systems will just get better and

00:42:38
better over time, so it'll become easier and easier

00:42:40
to identify old cheaters. So it seems like a big risk to

00:42:45
cheat, when we're going to be able to figure out cheaters

00:42:47
more. I don't know.

00:42:48
Do you worry about that? I mean, I don't worry about it

00:42:52
too much. I think, like, it might help us,

00:42:54
like, really understand — perhaps from a more first-

00:42:57
principles, like, thinking perspective — like, what the

00:43:00
goal of, like, school is. Like, it's not to memorize, like, words;

00:43:02
like, you need to have critical thinking, and, like, 'I'll teach

00:43:05
you — I'll teach you how to think and, like, view the world and

00:43:08
process it and come up with ideas,' right?

00:43:10
And that's what the goal of, like, going to school, of, like, learning

00:43:13
anything, is. Right?

00:43:13
If you want to learn something, it's not about memorizing stuff.

00:43:16
It's not about memorizing how to use a

00:43:19
complex process or a workflow, or whatever. It's about,

00:43:22
like, looking at the world in a critical way, having, like, a way

00:43:26
of understanding things, right? And I think, like, if I learn

00:43:29
how to code with something like a language model that can teach

00:43:33
me constantly, that can give me feedback, that can give me examples, and — and

00:43:36
I can build a good software idea, or I can execute something by

00:43:40
doing that — that's fantastic.

00:43:41
Right. Did I have to go through, like, a

00:43:43
formal training of reading these, like, books

00:43:46
and this traditional process? Maybe I didn't — then it didn't

00:43:49
matter, right? I think, like, that might happen,

00:43:51
and of course, it's normal. There's like push back at the

00:43:54
beginning, right? Because it's just it's

00:43:55
different. Right?

00:43:56
It's change. And we humans are resistant

00:43:58
to change. Yeah, we're very reluctant to

00:44:01
change. Like we're very reluctant to

00:44:03
change. And then the thing is like we

00:44:04
get used to change a lot. I was actually trying one of the

00:44:06
self-driving cars in SF a couple of weeks ago, and the

00:44:10
first, like, ten seconds is just very scary.

00:44:13
You're like, 'I want to get out' — just, like, yeah — just, like,

00:44:15
'Where's the driver?' Like — and then after a minute

00:44:18
you're like, okay, I'll just take my phone and just relax, right?

00:44:20
At first you're, like, very reluctant to change, and

00:44:23
then you've tried it, and then you kind of, like, assume, okay, it

00:44:26
works. And it, like, doesn't, like, crash.

00:44:28
I'm not dying. We can move on.

00:44:29
And, like, it works really well. And the human psychological

00:44:32
journey on embracing new technologies is a little sad.

00:44:35
Right? When something is exciting, we're

00:44:37
very fearful about it and then like you're saying, we just sort

00:44:40
of embrace it. And then now, you know, when I'm

00:44:43
10 minutes late because of the subway, I'm, like, furious, even

00:44:46
though, you know, in prior worlds, getting across New York

00:44:49
City could have taken me, like, days, right?

00:44:52
And required a horse, right? Right — there weren't even bridges

00:44:55
to do it at all, you know? Like — so it's sad how much we take

00:44:59
existing technology for granted and then spend all our time

00:45:03
obsessing over what's right over the horizon.

00:45:05
But, right — that's, that's humanity.

00:45:07
I agree. Your optimism for this

00:45:09
technology is truly contagious, because I really am —

00:45:11
I am thinking about it in less scary, cynical ways.

00:45:15
I'm thinking about it in more positive ways,

00:45:17
as in, what could it do to — you used the word 'augment,' or, Eric,

00:45:20
maybe you did. Like, how can we look at these

00:45:22
as tools that will help augment creativity and not replace it?

00:45:26
Yeah — in terms of particular companies, obviously you're

00:45:29
excited about your own, but what are the projects sort of using

00:45:35
generative AI that you're most excited about right now, or what

00:45:38
do you think people should be watching for?

00:45:41
Of a lot of things I've seen — there's some — I think there's

00:45:44
one movie in particular that, I think, for me, exemplifies a lot

00:45:47
of what we'll continue to see over the next couple of years.

00:45:49
I don't know if you guys have watched it.

00:45:51
Everything Everywhere All at Once. I love, love it.

00:45:55
Yeah. Okay.

00:45:55
We all kind of — I know.

00:46:02
So — it's a beautiful movie, right?

00:46:04
If you're listening to this and you haven't watched it —

00:46:05
whoa, watch it. It's, like, a phenomenal movie,

00:46:07
right? And I think — are we good?

00:46:08
Seriously, go watch it, guys.

00:46:09
I don't know if you guys remember — it has a lot of

00:46:11
visual effects, right? It's very intense, visually.

00:46:13
That's right. The most interesting fact for

00:46:16
me, when I learned it — and it was just, like, astonishing — it

00:46:19
was, like, the majority of those effects were made by five

00:46:23
people. Five people.

00:46:25
Right. Just five people that didn't —

00:46:28
that, I think, didn't have previous experience

00:46:30
building, like, feature-film-complexity — feature-

00:46:33
film effects like that — were able to put together that kind of

00:46:37
level of, like, quality. And, like, they're, like, very talented

00:46:39
people, for sure, right? But it's just five people doing what

00:46:42
used to take hundreds, if not thousands, of, like, people. And

00:46:46
when I read it, I was like, wow — like, this is just insane.

00:46:49
Who are these people? These are, like, superhumans, right?

00:46:51
What's going on? I searched for those five folks —

00:46:53
I searched, like, IMDb, pulled up the page of the movie, got

00:46:57
the five, kind of, VFX people behind the movie.

00:47:00
And, of course, like, I wanted to run a search and see if they

00:47:03
had Runway accounts, right? And all of them had accounts,

00:47:06
right? I was like, that's phenomenal. I

00:47:09
reached out to all of them, like, 'Yeah, we'll chat,' and we

00:47:12
chatted, and they — they had used parts of Runway

00:47:15
to edit the movie, right? It was just a small part, but,

00:47:18
yeah, very validating. Yeah.

00:47:22
And then my assumption was, well, I

00:47:26
guess the insight of why I searched for them in

00:47:28
our database is, if you think about creating a movie

00:47:30
like that, just the complexity of the movie itself. That's right.

00:47:34
To be able to execute that, you had to automate it, you

00:47:37
had to simplify it in a way, right?

00:47:39
And my assumption was, maybe they came across Runway,

00:47:41
because that's basically what we try to offer, right?

00:47:44
And so they did, and we wrote a case study about it.

00:47:46
Now some of them are already using it for other films.

00:47:48
I'll send you the case study. And I think, like, what I'm

00:47:50
really excited about is those kinds of things, right?

00:47:53
These are very highly creative teams with a lot of motivation,

00:47:56
a lot of, like, drive to just understand how to get their

00:47:59
ideas out without having to have these budget restrictions, or

00:48:04
any other set of limitations than to just execute it.

00:48:07
And it worked. It worked really well.

00:48:08
Like, the movie is probably going to win

00:48:10
dozens and dozens of awards. And so the short answer would be,

00:48:13
I'm really excited about those teams that are embracing

00:48:16
technology like this, that are, like, experimenting with it,

00:48:19
that are using it, that are trying to push it in ways that I

00:48:21
haven't even thought of, that no one else has thought of.

00:48:23
It's just, like, you need to expose it to more people.

00:48:26
And I'm also excited about, if you think about it, that same

00:48:29
approach taken to product building, right?

00:48:31
So far, the research in the space has been mostly led by,

00:48:34
like, domain experts, right?

00:48:36
Researchers who are, like, very deep into it.

00:48:38
It's, like, PhDs who are, like, incentivized to make

00:48:41
benchmarks in X, Y, or Z metric better.

00:48:44
But I think the outcomes and the results of using these models

00:48:47
are going to impact way more people than the researchers, right?

00:48:50
And the insights are going to come from, like, multidisciplinary

00:48:53
teams. You need hackers and artists,

00:48:54
you need, like, more people to jump in here.

00:48:56
So the companies that I'm most excited about, or the products, the

00:48:59
products I'm excited about, are products that can blend those

00:49:02
things. So you have the algo folks,

00:49:03
but I think you also have artists, like, speaking and seated

00:49:06
at the same table, being like, okay, here's what's possible,

00:49:09
right? Would that be useful or

00:49:11
not useful? How can I help

00:49:13
you accomplish something, right? I think that's the second wave,

00:49:16
which I think I was telling you guys about early on.

00:49:18
The first wave of AI, like, these waves of ideas

00:49:21
have been out there, there have been a few

00:49:22
historically, but 2015 to 2020 was the first one, and 2022

00:49:26
onwards is the next one. The next one is going to be

00:49:27
about figuring out how to collaborate with humans, right?

00:49:32
And how to take these algorithms from research domains and silos

00:49:36
into real-world examples. But part of what I'm taking you

00:49:38
to say is that, you know, there's a lot of emphasis on

00:49:43
generalized intelligence, and part of what you're saying is

00:49:46
that actually you need people focusing on specific use cases

00:49:51
and how they play out. Or is that the right contrast to be zeroing

00:49:55
in on? Yeah, I care more about humans,

00:49:59
and I think that has always been the purpose of technology,

00:50:01
right? It's helping you do

00:50:03
something. I think we might lose that, like, narrative, or go the

00:50:06
other direction, when you think too much about the technology for

00:50:09
the sake of technology. Technology is a way

00:50:11
of serving humans in a particular way.

00:50:13
Wow, I know a lot of people who could hear that.

00:50:20
Not gonna say who. This is a safe space.

00:50:24
I guess you can say it, it's a super safe space.

00:50:28
Do you mean, like, OpenAI? I mean, they're obviously the

00:50:31
elephant in the room. Microsoft is reportedly going to invest

00:50:35
ten billion dollars. I mean, are they a rival of

00:50:38
yours? Is their success good or bad or neutral for you? Or, like,

00:50:41
what's your view on OpenAI, and how do they relate to what

00:50:44
you're doing?

00:50:48
I think OpenAI is a research-driven organization, right? They're mostly about, like, pushing the

00:50:51
limits of, like, research, right? And they've done an incredible

00:50:54
job at that. They've released multiple, like,

00:50:55
breakthroughs over the last 10 years, right?

00:50:58
They started much more focused on reinforcement learning and then

00:51:01
moved, because they discovered new things, and that's just how

00:51:03
research works. And I think that they're a

00:51:05
pretty interesting organization

00:51:08
that's done real breakthroughs in research. But I

00:51:11
think research is not enough for products.

00:51:13
If you think about products, products are about, again, helping

00:51:16
serve human needs and helping people achieve or solve a

00:51:20
problem in a specific way. And part of it,

00:51:23
in this case, is research, because you can build the

00:51:25
fundamental technologies, but part of it is also

00:51:27
how people interact with it, right?

00:51:29
And there's a lot more that has to be built around it.

00:51:32
So I don't think of OpenAI as a competitor to us.

00:51:34
I think they're just a research organization. We don't

00:51:37
use their APIs or anything, you know.

00:51:40
No product. We build our own. Well, not

00:51:42
only might they at some point. I'm sure you know more about

00:51:45
this than I do. But, like, Jasper, whatever

00:51:46
super-hot company in the tech space, was built

00:51:49
on top of OpenAI, and then, you know, OpenAI comes out with

00:51:54
this ChatGPT that ends up competing with it, right?

00:51:58
Am I crazy? I mean, there are lots of

00:51:59
questions about, like, in terms of companies, who's actually going

00:52:03
to be able to win sort of the war here.

00:52:06
I mean, our bet is, you need to own your stack, you need to own

00:52:10
your technology, right? If you're building something, you

00:52:12
need to be able to, like, understand every piece of it,

00:52:15
right? Because if not, someone else can

00:52:17
just take it from you, right? And so our

00:52:19
first kind of, like, focus as a company has been just building

00:52:22
the full-stack components, right? That's very interesting. In a

00:52:25
way, it's sort of like asking the question: is generative AI

00:52:29
more comparable to the technology that is the

00:52:32
smartphone, or is it more similar to an operating system, in that,

00:52:37
will it matter? You know what I mean by that?

00:52:39
Like, is it the technology that things will be

00:52:41
built on top of, like in the case of Runway? Or are

00:52:44
we talking about something which is more, I guess you could consider it,

00:52:47
like, more commodified and less, like, owned? Or are we

00:52:50
talking about it as, like, oh, it's equivalent to, like, a fully owned

00:52:54
and operated operating system, or something like that?

00:52:57
I think it's pretty big. My hunch is it's closer to a

00:52:59
smartphone, to be honest. Like, when you have... I mean,

00:53:01
Apple has radically made that, like, the case, right?

00:53:05
You own your hardware, you build everything from the

00:53:08
hardware itself to the software to the platforms where people build

00:53:12
on top, right? And that's a way stronger

00:53:14
business case, or a business, like, argument, than having, like, light

00:53:19
integrations, mostly because, again, technology moves really

00:53:21
fast, right? And so you can't count on anything,

00:53:24
or you can't take anything for granted, if things are moving so

00:53:26
fast and they're becoming obsolete so fast. We can let you

00:53:30
go. Or, Alexis, do you have any final thoughts?

00:53:33
No, I think I've really grown on this podcast.

00:53:38
I'm glad. What do you think? I think, 'cause I think I started a

00:53:41
little more fearful, and now I think I'm more open-minded.

00:53:44
And also, I am terrified of the fact that 20 years from now

00:53:48
we're going to listen back to this, and I don't want to sound

00:53:51
like the person that got interviewed about the internet in

00:53:53
the '90s and was like, "You guys heard of this internet thing?"

00:53:55
Thanks. I'm really trying not to be.

00:53:58
I've been clear in my view: I think it's huge.

00:54:00
I mean, I was super skeptical of self-driving cars, because they

00:54:03
need to be complete. Whereas this side, you know, it's

00:54:06
about the human interaction. I'm already seeing, you know, like,

00:54:10
professional-type people generating questions and

00:54:14
drafting emails, and I feel like even if you're plenty smart to

00:54:18
write a good piece of writing, it's just, like, solving it.

00:54:21
And then the point that you've made sort of very clear on this

00:54:23
podcast is that nothing is more valuable than when a technology

00:54:27
saves human beings time. And so even if you don't view

00:54:30
it as, like, okay, it's going to be more creative than us in

00:54:33
certain ways, it's like: if it's saving us

00:54:35
time to do creative tasks, that's sort of the core value of

00:54:39
technology, and people will always want to save time.

00:54:42
So I'm, I'm super bullish on

00:54:44
that. I still think, I mean, we saw, like, Lensa there,

00:54:47
you know, there's still a hype cycle going.

00:54:49
No, I mean, people get so

00:54:51
excited, something's cool, and then it falls off.

00:54:54
So it's about sustaining that enthusiasm.

00:54:58
Yeah, I agree that it's about sustaining.

00:55:00
The work is sustained, steady development more than,

00:55:03
like, specific spikes of hype. And so we're here for the long

00:55:06
run to make that happen. I think sometimes things get better

00:55:08
after the hype cycle dies down a little bit, actually, because

00:55:11
there's less, like, fervor and, like, hand-waving in the field,

00:55:15
you know? And the people who are genuinely

00:55:18
able and passionate about those tools, like, I think the

00:55:20
creator economy is a great example.

00:55:21
Remember the five minutes when, like, everybody pretended to

00:55:23
care about the creator economy? And now ninety percent of those

00:55:27
people are not focused on that, but the ones that are, are the

00:55:30
folks that are actually passionate, actually competent in

00:55:33
that space. And I think that, ultimately, the

00:55:34
tools as a result will be much better for it. And I think

00:55:37
something like this and self-driving

00:55:39
cars are probably also parallel examples.

00:55:42
Totally. Cool, well, thank you both for coming

00:55:45
on. Listen to Alexis on

00:55:46
Non-Technical. Cris: Runway, check it out.

00:55:49
Alright, thank you very much for coming.

00:55:51
Of course, thank you for inviting me.

00:55:53
Thanks, Cris. Goodbye. Goodbye.

00:56:06
Goodbye. Goodbye.

00:56:08
Goodbye, goodbye. Goodbye.
