Roko's Basilisk (w/Nathan Benaich)
Newcomer Pod · October 25, 2022 · 01:14:05 · 101.75 MB


Nathan Benaich, the lone general partner at Air Street Capital, has long been on my radar as an artificial intelligence obsessive.

And so now that artificial intelligence is suddenly the fixation of the venture capital world, I invited Benaich on the Dead Cat podcast to talk about generative artificial intelligence.

With my co-hosts, Tom Dotan and Katie Benner, we talked about the promise of generative AI and the ethics of machines borrowing from the vast depths of human creativity.

I pay homage to the AI overlords, cheering for the triumph of artificial general intelligence, while Benaich warns us that the conversation about it is a bit of a distraction.

Benaich is the co-author of the State of AI Report that came out this month. It’s worth a read.

At the 42:40 mark, Nathan departs and Katie, Tom, and I change topics dramatically.

Tom reads from the former Mailchimp CEO’s email to the email marketing company discouraging employees from stating their pronouns at the beginning of a meeting.

The article in Platformer, which first published the email, carries the headline, "Did this email cost Mailchimp's billionaire CEO his job?"

Here’s an excerpt of Mailchimp’s then CEO Ben Chestnut’s message to the company:

I am noticing that whenever new employees introduce themselves in Zoom before asking their question, they’re also announcing their pronouns. This is completely unnecessary when a woman (who is clearly a woman) to tell us that her pronouns are “she/her” and a man (who is clearly a man) to tell us that his pronouns are “he/him.”

Tom, Katie, and I weigh in on the conversation around pronouns in the workplace, heavy-handed HR policies, and embarrassing CEO emails.

Give it a listen

Read the automated transcript



Get full access to Newcomer at www.newcomer.co/subscribe

00:00:05
Welcome. Hey, everybody.

00:00:13
Welcome to Dead Cat. This is Eric Newcomer. Tom and

00:00:16
Katie are here. I've totally lost my voice.

00:00:19
I am mustering through sickness to make sure that someone here

00:00:24
who has been writing about AI is on the podcast.

00:00:27
Because Katie and I are just here to... And your AI

00:00:33
standards are here. The first generative AI podcast, we

00:00:37
have Nathan, who's the sole

00:00:40
general partner of Air Street Capital.

00:00:42
I feel like I started covering European tech,

00:00:45
and, like, we started exchanging a lot of messages.

00:00:48
And yeah. So I've been looking for an

00:00:50
excuse to do something with you and then finally I got on the AI

00:00:54
hype train and you've been thinking about AI for many years

00:00:57
and you're like, everybody, welcome to the party, or

00:00:59
whatever. Yeah, glad to have you.

00:01:02
Yeah. What's it feel like now that

00:01:03
everyone's obsessed with AI all of a sudden? Has it been a

00:01:06
shock, or what's your reaction to the mania of the

00:01:10
moment? It hasn't quite been a shock, because I was sort of expecting this

00:01:14
to happen in a way. It's great that, you know, like

00:01:16
a new advanced technology domain gets attention from increasingly

00:01:20
generalist investors, or investors who focus on

00:01:23
different industries, because it almost then, you know, reaffirms

00:01:26
that like the technology is for real and probably has

00:01:29
applications in all sorts of domains.

00:01:31
Yeah, and now your portfolio companies can get marked up, like

00:01:33
you just have to hope you got in on a lot of the deals. Explain to me

00:01:37
what happened exactly. Because, I mean, if we're

00:01:39
just going to trace the chronology of the last year, in

00:01:42
venture investing, obviously 2020, 2021, maybe the first half of

00:01:48
this year has been about crypto and you saw all these funds

00:01:52
rebranding themselves as crypto-focused funds.

00:01:54
And all of these VCs that I used to talk to all the time about

00:01:57
their old investments are, like, what the hell are you talking

00:01:59
about, delivery, to me?

00:02:01
I don't do delivery. I'm a crypto investor now, you

00:02:03
know, and I mostly do crypto and it seems like that's all been

00:02:07
forgotten now. Like, that's all been memory-

00:02:09
holed, and, like, they all switched their hats: they took off

00:02:11
their .eth hat and they put on AI. Like, explain to

00:02:16
me what has happened over the last year. The real, like,

00:02:19
inflection point was when you hit this first viral

00:02:27
use case of machine learning, which is images and video. Like,

00:02:30
before, when you had text models that were working quite

00:02:33
well, and you could generate like a script or you could talk

00:02:35
to a bot. There's something that's

00:02:37
fundamentally less interesting about reading a script with a bot

00:02:40
than looking at these epic images that never existed

00:02:43
before, that you designed with your own instructions on your

00:02:47
mobile phone. So I'd almost call it like the

00:02:49
consumerization of machine learning, or at least of the product,

00:02:52
the output of machine learning, which is understandable and much

00:02:57
more tangible to almost everybody who uses the internet.

00:03:01
than the products that came before. And while those tools

00:03:04
like Stable Diffusion and DALL-E from OpenAI were available to

00:03:09
people in the know for a while, they really came out, what, like,

00:03:13
late August, September, and that has really sort of kicked this

00:03:17
off, just because the public's very aware.

00:03:20
Yeah, and I think their distribution's also been quite

00:03:23
interesting, from how the machine learning market has

00:03:26
evolved, because the central dogma has always been that

00:03:28
centralization wins, which is, like, to be good at machine

00:03:31
learning, you need to have all the data, all the top people,

00:03:35
all the compute, and then the product, and you mix it together

00:03:39
and then you get sort of, you know, great machine learning on

00:03:42
the other side. And crypto's been the counterbalance to that,

00:03:44
which is just decentralize everything, give power back to

00:03:47
everybody. And, yeah, over the first few months of

00:03:49
this year, the fact that you can now like arm the rebels as it

00:03:53
were, these open source communities, which basically

00:03:56
call themselves, like, research collectives, and they either

00:03:59
gather on Discord or other forums, that might not even be companies

00:04:03
or corporate entities, and provide them with access to

00:04:06
compute, which is what Stability really did.

00:04:09
It sort of liberates the creativity of individuals who

00:04:11
couldn't participate because they weren't part of the, you

00:04:14
know, the elites, the big-technology-company

00:04:17
elite. And that's actually led to, like,

00:04:19
massive open sourcing which then busts up the centralization

00:04:22
hypothesis and shows you that there's

00:04:24
like, another path to building and distributing machine

00:04:27
learning-based products. I think that's what, like, a lot of

00:04:29
folks are getting excited about too. How important was DALL-E as a

00:04:34
product and I guess for our listeners that haven't used it

00:04:37
out there, I mean, this is the tool that Eric

00:04:39
was mentioning, like, OpenAI released.

00:04:41
And it allows you through fairly direct instructions.

00:04:44
You've got to be direct because it's not always that smart.

00:04:47
You've got to basically say, like, in the style of Monet, or... I

00:04:50
feel like the key is to, like, yeah, give it a realistic style, right?

00:04:54
Exactly right, right. Clip art.

00:04:56
But basically, you can give instructions to, you know, this

00:04:59
piece of software and it will generate, in less than a

00:05:01
minute, some fairly creative depiction of what you wrote, and

00:05:06
it's like very easy to use and is like a very clear product of

00:05:11
what AI can do. So, like, how important do you

00:05:14
think just that being released even on a beta level and then

00:05:17
more openly played into the excitement from investors.

00:05:21
That there's like a whole opportunity here.

00:05:23
I think the community was, like, pretty blown away with the

00:05:25
results of DALL-E, and especially, like, DALL-E 2, but then you get back

00:05:29
to this question of: can you build businesses on the

00:05:31
APIs, on the closed APIs, of the large companies? And, you know,

00:05:35
there's a few that are built on GPT.

00:05:37
So I think that that excitement got even greater once, like, these

00:05:41
models became reproduced in the open source world by folks like

00:05:44
Stability, Eleuther, etc. So I think it's really just

00:05:49
this notion of, like, busting up centralized entities and

00:05:51
providing tools to everybody else.

00:05:54
I think they've also landed really well, just because we

00:05:56
kind of live in this, like, TikTok generation of short-form

00:05:59
images and video, and Instagram, and the outputs of

00:06:03
these models are like perfect fodder for those platforms.
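For readers who want to try the prompt-to-image workflow being described, here is a minimal sketch using the open source Stable Diffusion weights via Hugging Face's diffusers library. The checkpoint ID, prompt, and the GPU assumption are illustrative choices, not anything specified on the show:

```python
# Minimal text-to-image sketch with open source Stable Diffusion weights.
# Assumes: `pip install diffusers transformers torch` and a CUDA GPU.
import torch
from diffusers import StableDiffusionPipeline

# Download the published model weights from the Hugging Face Hub.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

# A direct, style-anchored prompt, as discussed above.
image = pipe("a garden bridge over a pond, in the style of Monet").images[0]
image.save("monet_bridge.png")
```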

00:06:07
Well, what's interesting to me about DALL-E and this wave of

00:06:10
generative images is that it's something that you can very

00:06:13
easily explain to someone. Like, I think it passes the dinner-table

00:06:17
test that crypto certainly had a very difficult time doing.

00:06:21
You know, like, proof of stake has a hard time passing the test

00:06:25
with grandma: just trust me,

00:06:28
put all your money in this, right?

00:06:30
Different dinner table conversation especially with

00:06:32
older people. But yeah, you know it's like

00:06:35
everyone, I'm assuming all of our listeners spent last

00:06:37
Thanksgiving trying to explain to their uncles, or even worse,

00:06:40
having their uncles explain to them the value of crypto and,

00:06:44
like, what it can do. Whereas with this, it's like, oh,

00:06:47
why don't you think of some words and let me put it into a

00:06:49
prompt and I can enter it and it'll draw an image.

00:06:52
And at the very least you're like, oh wow, that is

00:06:53
impressive technology that I can understand,

00:06:55
like you were saying, you can grasp it. And, I mean, I know it seems like

00:06:59
really simplistic to put it in these terms.

00:07:01
But is that how investors felt as well?

00:07:03
It's like, oh, I get this, this is cool.

00:07:06
I think there's probably a bunch of, like, I get this,

00:07:08
and this looks like consumer technology that they've been used

00:07:10
to investing in for a very long time.

00:07:12
I think the other thing too, is some investors realizing that

00:07:15
they don't have, like, a bet in AI. Like, what is my big bold

00:07:19
bet? And when, like, the proverbial

00:07:21
investor sees the OpenAIs raise a ton of money,

00:07:24
doing some great things, that there are some new

00:07:27
offshoot labs that we profile in our State of AI Report that

00:07:29
are raising tons of money, you know, built by brilliant people,

00:07:32
that there are kind of all these adjacencies that you can apply

00:07:36
machine learning to, they sort of just drum themselves up into

00:07:40
this FOMO. Can you break down, just, like,

00:07:43
generative AI versus everything else?

00:07:45
Or I feel like all of a sudden, we're talking about generative

00:07:48
AI. Yeah.

00:07:49
By textbook definition, you basically have, like, two kinds of

00:07:52
categories of machine learning. You have what's called

00:07:54
discriminative machine learning which is basically like given a

00:07:58
data set, how can I draw a boundary

00:08:00
between two categories in that data set?

00:08:03
So say the data set is just, like, images of animals: where's the

00:08:06
boundary that separates one species from another? And then

00:08:10
you train this model on this data set.

00:08:12
And then its task, when it sees new data, is to just

00:08:15
classify the species that is present in the image.

00:08:17
And that's discriminative. By contrast, like, generative is

00:08:20
basically where the model is trying to learn like the

00:08:23
statistics, the probability distribution, of a data set, like,

00:08:27
learn something intrinsic about it, such that you can ask

00:08:30
the model to basically generate, to synthesize, a new example

00:08:34
that fits that data distribution.
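To make that textbook distinction concrete, here is a minimal sketch on a hypothetical one-feature "species" dataset (not anything from the episode): the discriminative model only learns the boundary between the two categories, while the generative side learns the distribution's statistics and can synthesize new examples from it:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Toy dataset: one feature (say, body mass in kg) for two species.
cats = rng.normal(4.0, 1.0, 500)
dogs = rng.normal(25.0, 5.0, 500)
X = np.concatenate([cats, dogs]).reshape(-1, 1)
y = np.concatenate([np.zeros(500), np.ones(500)])

# Discriminative: learn only the boundary, then classify new data.
clf = LogisticRegression().fit(X, y)
print(clf.predict([[6.0]]))  # -> [0.], i.e. "cat"

# Generative: learn the distribution's statistics, then synthesize.
mu, sigma = cats.mean(), cats.std()
print(rng.normal(mu, sigma, 3))  # three new, plausible "cats"
```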

00:08:37
And so, like, at a high level, basically what we've done is created

00:08:40
models that are like able to sort of learn like increasingly

00:08:44
complex datasets in this case, like the entire internet or the

00:08:48
entirety of like Flickr or whatever in pairs of, you know,

00:08:52
text descriptions to Image representations such that when

00:08:56
you ask it to generate, like, some arbitrary scene, like, with

00:08:59
some very specific prompts, like, it's seen different combinations

00:09:03
of these things before and can smush them together in a way

00:09:06
that like looks nice. So you could, you could probably

00:09:09
argue that like a lot of the generative AI that like we're

00:09:11
talking about today, which is images and video and text, is sort

00:09:15
of, like, a rebranding of, like, creative AI that you used to see, like,

00:09:18
five years ago or so, when especially this technology

00:09:21
called GANs, generative adversarial networks.

00:09:24
Hmm. Which were, like, you know, very hot in AI a couple of

00:09:27
years ago. You basically get, like, one model that generates an

00:09:31
image and then another model that says like is it good or is

00:09:33
it bad, and you sort of put them together against one

00:09:36
another iteratively, and then at some point, like, the image

00:09:39
generator becomes good, because it can fool the network that's saying

00:09:42
is it good or bad. So it's sort of, like, not new, and

00:09:44
is, like, fundamentally part of, like, textbook machine learning.
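A minimal sketch of that adversarial setup, using toy 1-D data rather than images and arbitrarily chosen hyperparameters: the generator learns to synthesize samples the discriminator can no longer tell apart from the real distribution:

```python
import torch
import torch.nn as nn

# Generator G maps noise to samples; discriminator D scores "real or fake".
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0  # "real" data: N(3, 0.5)
    fake = G(torch.randn(64, 8))           # generated samples

    # Discriminator step: label real as 1, fake as 0.
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(64, 1)) + \
             bce(D(fake.detach()), torch.zeros(64, 1))
    loss_d.backward()
    opt_d.step()

    # Generator step: try to fool D into calling fakes real.
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(64, 1))
    loss_g.backward()
    opt_g.step()

# After training, generated samples should cluster near 3.
print(G(torch.randn(5, 8)).detach().squeeze())
```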

00:09:48
You know, I feel like I'm going to ask the most obvious question

00:09:50
here, so bear with me. But how does human creativity

00:09:55
work alongside generative AI? Does it compete with generative

00:10:00
AI? You know, if it can do so much based on extraordinary works

00:10:06
already created in the past that we all love and admire, and

00:10:10
we imagine that AI could create really pleasing, interesting,

00:10:15
thought-provoking work based on that.

00:10:18
Where does human creativity sit alongside that? I think

00:10:22
human creativity will just be a guide, almost.

00:10:26
And then these generative models would just help you explore,

00:10:29
like, this search space of what you could possibly make.

00:10:32
And I think, at a high level, it's kind of beautiful with

00:10:34
these machine learning models that, you know, synthesize more

00:10:37
data than any human could ever do in their entire career.

00:10:40
if they tried to become expert at something. It's that, kind of,

00:10:43
spread over, like, humanity's history, we can sort of get past,

00:10:46
like, merely local optimum solutions to things.

00:10:50
If you remember the AlphaGo case study:

00:10:53
it's like, if you train the system on, like, as many

00:10:56
simulations as physically possible in the space of time,

00:10:58
then you'll probably get some gameplay

00:11:01
that's better than what we've seen before.

00:11:03
And so it turns out that like all of human expertise passed on

00:11:06
from one generation to another yields a local optimum that's

00:11:09
not the best that exists. And so I think, you know, in a

00:11:12
way, these generative systems are sort of guides to get us to

00:11:16
more optimal solutions. And this applies to making

00:11:19
pictures prettier and prettier but also applies to making more

00:11:23
potent drug molecules and

00:11:25
pharmaceuticals. It's funny to hear this answer in the domain of

00:11:29
creativity, though. You're like, they're going to be,

00:11:31
they're going to rank in a higher percentile of creativity

00:11:34
than we have. And it's, like, very reductive, too,

00:11:37
because it takes things like Picasso and turns him into sort

00:11:41
of raw material. It turns him into a commodity,

00:11:44
right? Yeah, in order to generate more

00:11:48
images. Yeah.

00:11:49
I wanted to build on sort of Katie's question, which is just

00:11:52
also the sort of latent, like, plagiarism stuff, or just, like,

00:11:56
the machines taking advantage of, like, past human creativity to

00:12:00
become smarter. There's sort of the

00:12:02
philosophical question in that, where I think some artists feel

00:12:06
like, well, they're taking my data to build these future

00:12:10
drawings, and really, I'm not getting a cut of that.

00:12:13
You know, it needed me.

00:12:15
So partially it's just, like, philosophical,

00:12:17
but I am curious also, like, as an investor:

00:12:20
is there any, like, legal risk here?

00:12:22
Someone was talking to me. You know, like this generative

00:12:25
work could be done in like, video games or something.

00:12:28
And you can imagine, like, the set of video games is much

00:12:31
smaller, and it could be much more obvious.

00:12:34
If you're sort of building sort of an algorithm off of existing,

00:12:38
sort of, you know, video game IP. I don't know.

00:12:41
How do you think about sort of the intellectual property of

00:12:44
what goes into these systems? Yeah, I think it depends on

00:12:48
the scale of the data set that

00:12:50
the model's learning from, because you could probably argue, if the

00:12:53
model is sucking up the entirety of the internet, what is one

00:12:56
incremental, like, blog author's blog going to do for this

00:12:59
model, and how can they justify that? Unless perhaps, like, they're

00:13:04
so iconic in their style, like in the artistic space with Picasso.

00:13:07
Or it could be the grounds for the largest class action lawsuit

00:13:10
of all time. Yeah so that my words were

00:13:14
stolen to train the large language model, and I believe all 7

00:13:16
billion people on earth have a claim to that.

00:13:19
I love it when big law firms get together with

00:13:22
technology. Yes.

00:13:24
Wait a second. So I think that's probably one

00:13:26
of the only ways that like one can legislate against these

00:13:30
systems, which is, like, it takes the entirety of, like, a

00:13:32
significant pool of people to legislate, or to kick off

00:13:36
lawsuits, because the individual consumer's not going to have a

00:13:40
say. I'd love to see that advertisement on TV.

00:13:42
Have you or your words been used as part of a large language

00:13:45
model that you feel is impeding on your personal rights and

00:13:47
creativity? You may be entitled to a

00:13:50
settlement. Hey, we have a prediction in the

00:13:52
State of AI Report that there will be some fun, like,

00:13:55
content lawsuits, that will, I think, at some point just give

00:13:58
rise to a new kind of licensing agreement, which is, like, for

00:14:02
example: like, I'm Reddit, and in order to train on the

00:14:05
entirety of my corpus of conversations, etc., then you

00:14:08
have to abide by blah, blah, blah. Fascinating.

00:14:11
So there actually could theoretically be some sort of

00:14:13
licensing element of this that has to be taken into account for

00:14:16
these large language models. Yeah, I mean, I think if you're

00:14:19
a big content owner, you'd protect, like, your future

00:14:21
monetization stream, and that's a way

00:14:24
for you to participate in this future that's pretty

00:14:26
inevitable. I think once open source, like,

00:14:28
gets their hands on something, it's kind of an inevitable

00:14:30
direction of travel. So then the question is, like,

00:14:32
are you Wikipedia, are you Encarta?

00:14:35
So is that one of the reasons why there's such a rush to get in

00:14:38
early? You know, is one reason that investors sort of see this

00:14:43
on the horizon, which would slow growth,

00:14:45
so get in now, while there's still a lot

00:14:47
of room for very quick movement? I think yes and no, but I think

00:14:50
if you're early, then you probably might suffer from, like, being the

00:14:53
tallest poppy and sort of being the target,

00:14:55
you know? And then once, like, the mold

00:14:58
has been cast, then the fast followers can move in, and I

00:15:02
think that's been proven: Napster's gone, Spotify

00:15:05
is here, like, you know? Yeah, exactly.

00:15:07
I think that kind of dynamic can happen, but it is also true that

00:15:10
in open source and just in communities, in general, like

00:15:13
once you get momentum, it's hard to stop it,

00:15:16
unless you really screw things up or something new pops up.

00:15:19
And so it's tough to think about what are the competitive,

00:15:22
long-term moats that you can apply to sort of keep your pole

00:15:26
position. Did you follow the whole

00:15:28
conflict this week between Stability AI

00:15:31
and Runway? Yes, just yesterday, actually,

00:15:34
and, yes, I tweeted, I was like, open source AI for all, like,

00:15:38
what. Can you recap it here? It's very obscure;

00:15:41
it's happening on Hugging Face, which is, like, the message board

00:15:45
of the, sort of, the AI world. So it's all very deep in the weeds, but

00:15:49
basically these two companies, both backed by Coatue, you know, the

00:15:53
huge investor. Stability AI is basically the company that

00:15:57
put out Stable Diffusion with some other researchers, and then

00:16:01
this company Runway, I think their co-founder participated

00:16:06
in the research with Stable Diffusion. They put out, like, an

00:16:09
update to Stable Diffusion without Stability

00:16:12
AI's permission, and then Stability

00:16:14
AI basically said this was a violation of their IP and

00:16:18
threatened them and then eventually somehow Runway got

00:16:21
them to back down. But it's, like, clear

00:16:24
these sort of open source projects are not as open source,

00:16:29
maybe, as represented. Different people want to own them, and

00:16:32
people are raising at billion-dollar valuations off

00:16:36
technologies where it's not clear who

00:16:38
controls them. Nathan, did I gloss that right?

00:16:41
Or what am I missing? Or what do you think's

00:16:42
interesting about the whole incident?

00:16:45
I think you got it quite right. The other interesting thing is

00:16:49
like, the core technology and paper and, basically, research that

00:16:52
underpins, like, Stability came from academic environments that,

00:16:56
you know, were sort of enabled by large compute infrastructure.

00:17:00
And so that's sort of also been, like, increasingly ignored as,

00:17:03
perhaps, like, this ecosystem has become financialized by applying

00:17:07
large valuations to companies. So I was surprised at this

00:17:12
use of the keyword of, like, Stability IP, because I thought,

00:17:16
in all the marketing, there is no IP, because it's all open

00:17:19
source by default. It's basically open source and

00:17:22
you understand this better than I do.

00:17:23
It's open source, but then Stability AI basically spent a ton to run

00:17:29
simulations of it, or whatever, to train it, right?

00:17:32
And that costs a lot of money. Yeah.

00:17:34
But then, even the training work helps everybody because that

00:17:38
system is just out there. Or, yeah, Stability

00:17:41
AI isn't able to say, we paid to train it,

00:17:43
we only get the better version? Or, yeah.

00:17:45
I mean, so, like, there's, sort of, like, the model code, and then

00:17:49
you train it, and, you know, you can train it on whatever. Say,

00:17:52
like, you have a data set, I have a data set: we'll sort of get

00:17:55
a different model, because we've trained it on a different data

00:17:57
set and the difference is just expressed by what's called

00:18:00
weights. So basically this model has, like,

00:18:03
tons and tons of knobs, and you need to tune, like, billions

00:18:07
of knobs. And so if you train it on

00:18:09
different datasets, you'll get slightly different knob

00:18:11
configurations. And then you can, like, list these

00:18:14
knob configurations on Hugging Face, and then you can download

00:18:17
them and then apply them to your model without, like,

00:18:20
retraining your own model and then you've got the same model

00:18:23
basically. So these are what's called, like,

00:18:25
model checkpoints, weights, that kind of thing.
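In PyTorch terms, those "knob configurations" are the model's state dict. Here is a minimal sketch, with hypothetical file names, of how the same architecture picks up someone else's training run by loading published weights instead of retraining:

```python
import torch
import torch.nn as nn

# The "model code": the same architecture everyone shares.
def make_model() -> nn.Sequential:
    return nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))

# One party trains a model, then publishes the tuned "knobs" (weights).
model_a = make_model()
# ... training on dataset A happens here ...
torch.save(model_a.state_dict(), "checkpoint_a.pt")

# Anyone with the same architecture loads those weights, no retraining.
model_b = make_model()
model_b.load_state_dict(torch.load("checkpoint_a.pt"))
```

Hub utilities like `from_pretrained` in the diffusers and transformers libraries wrap essentially this download-and-load step.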

00:18:28
But yeah, I mean, these architectures are largely in the

00:18:31
public domain, and then the data set that it was trained on is in

00:18:34
the public domain. It's this massive data set

00:18:36
called LAION. So, I think, unless a company

00:18:40
takes the open source model and then trains it on their own

00:18:43
corpus of, like, images or video, and then someone steals that,

00:18:49
that's, like, theft of IP. But then if they just publish

00:18:52
the model back on the internet with the new weights and then say

00:18:56
it can be used for both commercial and research purposes, then it's

00:18:59
open access. It's a theme that, like, Tom and

00:19:03
Katie and I have, like, discussed over the years, and it's always

00:19:06
interesting. There's in technology, like, a sense of

00:19:09
fatalism, where it's like, if something's happening,

00:19:14
Silicon Valley doesn't always want to have like some huge

00:19:16
ethical debate about it because there's just sort of a realism

00:19:19
that, like, well, the cat's out of the barn.

00:19:22
I mean, I think we saw it with, like, OpenAI,

00:19:25
right? Like, DALL-E was slower to be open access than some of

00:19:30
these others, and then, like, Stable Diffusion basically

00:19:33
jumped the gate, and then DALL-E's like, fuck it, we'll be out there

00:19:37
too. I mean, do you think, like, this is

00:19:41
like a controlled enough situation where anybody can be

00:19:44
thoughtful about, like, how the technology should develop, or do

00:19:48
you think this is just sort of, like, a mad dash, where it feels

00:19:52
like, this is happening,

00:19:54
I should be the one to monetize it before somebody else.

00:19:58
Well, I don't think the primary motivation for Stability is

00:20:01
monetizing. I think it's really distributing,

00:20:03
like, systems to anybody who wants to run them and who can

00:20:07
benefit from them. But I do think it has some

00:20:10
implications over, like, the kind of alignment and safety and

00:20:13
guardrails and things like this around these systems.

00:20:15
And I mean, the canonical, like, example was OpenAI and big tech

00:20:20
companies that have their own views as to what people could

00:20:23
use these tools for, in the form of filtering

00:20:25
certain prompts, that would gate the model from creating certain

00:20:29
things that were deemed to be, like,

00:20:31
unsavory. And then, the alternative is

00:20:33
like, Stability's: anybody can go do this. But in a

00:20:35
way, like, who should decide how this gets used? In a way,

00:20:40
you could say that, you know, the entirety of humanity that's

00:20:43
going to work on these open source models could potentially

00:20:46
get to a better place than a few people in a single company.

00:20:51
That's pretty much the experiment that's getting run at

00:20:53
the moment, because, as you say, like, once one company goes

00:20:57
open source, game-theoretically, everybody else has

00:21:00
to, if you want to be relevant. Hence the, like, Wikipedia-versus-

00:21:03
Encarta analogy that I think is quite topical here.

00:21:07
You know, whether either of them end up being good,

00:21:09
like, lucrative companies is, like, another question. But this is

00:21:12
only, like, the first-order effect.

00:21:14
You have all the second-order effects, which is, what other

00:21:16
fields are going to benefit from these innovations?

00:21:19
And that's where I'm spending, like, a lot of time,

00:21:21
especially as these kinds of models touch, like, problems in

00:21:24
biology and chemistry, and physics, and drug discovery, and

00:21:27
things like that. And that's sort of occurring

00:21:29
in the shadows, because it's slightly technical and goes back

00:21:32
to, like, the non-viral, non-consumer-y use cases of machine

00:21:34
learning, but it's very real. We're all writers.

00:21:38
How terrified do you think we should be that generative

00:21:42
AI will successfully replace writing? Like, some VCs say, oh,

00:21:47
writing's even easier than images, right?

00:21:51
It's hard. Like, I mean, as somebody who wrote, like,

00:21:55
one master's and one PhD thesis, and I try to write a newsletter

00:21:58
as good as Eric's, but, like, this is hard.

00:22:01
So I'd use the classic, like, Pareto

00:22:02
principle. Like, I think it can do, you know, 80% of the job.

00:22:05
And then the question is, like, how easy is it,

00:22:07
as a user experience, to solve the last 20?

00:22:10
And I think in many of the writing assistants that I've

00:22:12
tried, it, like, you know, generates text, but then you

00:22:14
kind of get halfway through and you're like, this is garbage.

00:22:16
Like, this is not good enough. One of the newsletter writers

00:22:20
did an AI-driven tweet storm that went viral, and he said it

00:22:24
was, like, his best. And I guess my pushback on this writing

00:22:28
question is almost, like, my worry is about the audience.

00:22:32
Like, I think, like, for you and me, like, the writing from 80 to 100

00:22:35
is very different. It's like, oh, this looks like

00:22:38
good writing, yeah, but it is incoherent.

00:22:40
The people you're referencing aren't real, or, like, whatever,

00:22:43
you know. The actual logic,

00:22:45
the real key part, is not there. But the aesthetics of it, that

00:22:49
it looks like something you would say.

00:22:50
It feels like it has takes, like, it feels like a clean

00:22:54
solution to a problem that's there.

00:22:56
And so if the audience is dumb enough, you know, then you could

00:23:00
make money off of it, you could build a large audience. And I

00:23:03
feel like that's sort of terrifying, that, like, part of

00:23:06
what's been protecting the world is just that the

00:23:08
people doing the writing want to believe that

00:23:11
it's, like, coherent. But if you just, like, unleash a

00:23:15
sort of generative AI, it's like, well, can people tell the

00:23:18
difference or not? I mean, do you think that's too

00:23:20
cynical? Or do you see my sort of fear

00:23:22
there, that the humans need to be good, like, they need to care

00:23:26
enough that it makes sense? Yeah, I mean, I'm coming from a

00:23:30
positive angle. Like, if I can get, like, a (that's a low bar, to be more

00:23:34
positive than my fellow guests) a system that can

00:23:37
take my, like, English-written newsletter and write it in, like,

00:23:40
God knows how many different languages, or create different

00:23:42
formats or create hot takes that are shareable on different

00:23:46
platforms, or can just speak it in the same way that I've spoken,

00:23:49
and, I've tried to do that manually,

00:23:51
it's a pain. Like, I'm happy paying, like, 10

00:23:53
bucks a month for that. And, like, I wouldn't really

00:23:55
be concerned with it taking my job right off the bat. It's, like, you

00:23:58
know, amplifying the things that I'm already doing, but, like,

00:24:01
removing some of the, like, real work. Whether it ends up, like,

00:24:04
entirely replacing me. I think, obviously, that's just

00:24:06
like, very hard to tell and by that point, maybe I'll have

00:24:09
found something else I want to do.

00:24:11
So, and language is so filled with nuance, in terms of word

00:24:15
choice, translation. It's interesting

00:24:16
you brought that up. Translation is actually, in some ways,

00:24:18
extremely hard, because straight translation often does not

00:24:22
capture meaning at all. So it's a, you know, language is

00:24:27
very tricky. Imagine a straight translation

00:24:29
of The Iliad. I mean, that wouldn't really

00:24:30
work, right? Yeah, I mean, somebody was just

00:24:33
telling me yesterday that AI is very appealing in the sort of

00:24:36
cross-cultural context, because you could imagine, like, novels

00:24:41
where right now, the novel talks about, like, New York City,

00:24:43
where the author's from, but if you want to sell to a

00:24:46
mass-market audience, like, you could say, oh, this machine's

00:24:49
going to figure out, this reader's in, like, Beijing or

00:24:52
whatever, and we'll replace it with, like, their favorite, like,

00:24:55
local shop. And even if it's, like, you know, imperfect and doesn't

00:24:59
get the language right, it's still, like, better than

00:25:02
today, where there's no effort made to sort of pander to the

00:25:06
reader. So I mean, we could enter this

00:25:08
world where, like, stuff is really sort of catered, where

00:25:13
literature is. Would it be better to read, like, Lord of the

00:25:16
Rings and have it set in Washington, D.C., right?

00:25:19
Yeah, like, let's replace Middle-earth, you know, I would

00:25:24
go with Capitol Hill, and you will toss the ring into the

00:25:27
fires of the Rotunda. I don't want to imagine a world

00:25:30
I've never seen before. That's not the fucking point of

00:25:33
fiction. Yes, it is actually the fucking

00:25:34
point of fiction. Well, you know, what's interesting

00:25:36
though is that like that kind of piggybacks onto this idea of all

00:25:40
content being catered to our personal tastes,

00:25:43
and this sort of social-media-driven idea of, you know,

00:25:46
algorithm-driven consumption. And, like, why shouldn't, you know,

00:25:50
the next, I don't even know what popular

00:25:52
book series are out there these days, but, you know, like, the next

00:25:54
Nobel-winning book be iterated towards different audiences,

00:25:58
because that's the way people consume everything else.

00:26:01
You know, there's no advantage to central entertainment,

00:26:03
centralized experiences: something that you have to, you

00:26:06
know, relate to other people's point of view? That's done with,

00:26:09
that's over. That's the old way.

00:26:11
However, I would like to see a Hunger Games L.A. version.

00:26:14
Sure. Yeah, I mean, in a way it's like

00:26:16
mass personalization, everybody gets their own version, but in a

00:26:20
way it's kind of sad, because it's, like, a loss of opinions.

00:26:23
It's also a loss of a cohesive experience. It's lost.

00:26:27
I mean, like, when, this is ages ago, and I didn't read these

00:26:30
books until much later, because Eric made me, but Harry Potter is

00:26:33
a great example of a series of books that created a creative,

00:26:39
imaginative experience that people across cultures, age

00:26:44
groups, socioeconomic groups, races, and genders could all

00:26:49
encounter together. And do we not want to have

00:26:52
things that bring people together anymore?

00:26:54
Like any good millennial, I militantly insisted that Katie

00:26:58
read it. It was really intense.

00:27:01
It was a lot of pressure. It was a long,

00:27:03
month-long campaign. I, like, house-sat for you,

00:27:06
and my request in exchange for house-sitting

00:27:09
was: yes, you read Harry Potter. I left you,

00:27:11
like, my own copies, which was, like, so intense, because you're,

00:27:17
like, Mormons out there. It was like, chill out a

00:27:19
little bit. You don't have to leave it

00:27:22
everywhere. Yeah.

00:27:25
I don't know. It's funny.

00:27:26
I mean, you wanted me to have that experience because you wanted,

00:27:30
right? For us to be friends and for you to

00:27:32
understand me, right? Like, if we don't have that shit

00:27:36
anymore, that we share, that's not of our own personal

00:27:38
preference, like, what, what do we have?

00:27:40
But you know where I could see that concept being actually very

00:27:43
appealing to publishers is the idea of stuff being, you know,

00:27:46
of its time and, like, not aging very well.

00:27:49
I don't mean, like, thematically, but I mean, like, word choice, or,

00:27:52
you know, the stuff in, say, the Huckleberry Finn example.

00:27:55
Yeah, word choice. Yeah.

00:27:59
I mean, yeah. I don't even know what, like, the

00:28:01
AI woke version of Huckleberry Finn would call, you know,

00:28:04
Mr. Jim, but, like, yeah, it's all those things.

00:28:07
Anyway, I don't want to spend time on that but my point is, I

00:28:09
can see actually that idea being, like, well, why can't a book

00:28:12
be, like, a dynamic thing? And over time, AI can identify

00:28:16
what are the problematic themes and words in the book and update

00:28:20
it in a way that, you know, doesn't offend people in a way that it

00:28:23
might have at the time that it was written.

00:28:24
I mean, this sort of starts in the video game environment of,

00:28:27
like, programming these non-player characters that

00:28:30
have certain behaviors, respond in certain ways, and therefore

00:28:33
give, like, a unique experience to the game player.

00:28:36
And, like, the way they're trained is quite cool, too, where you can,

00:28:39
like, import an example of a conversation or a script and

00:28:43
then, sort of, like, tune some knobs based on personality

00:28:46
traits, say, you know, I don't know, they behave like Zeus or

00:28:49
something, and then the agent knows that,

00:28:51
because it's, like, read all of Wikipedia and stuff like this.

00:28:54
So it was pretty wild to see that.

00:28:56
And so perhaps it's, like, more in these virtual worlds that this

00:28:58
will happen. In the beginning of our

00:29:00
conversation, I mean, you were sort of nodding to the fact

00:29:02
that, you know, a lot of the AI work is very open source and,

00:29:07
like, sort of not totally, like, financially driven.

00:29:10
I don't know. But is there, like, a clear, like,

00:29:14
AI researcher, sort of, like, ideology, or, like, what are the

00:29:17
sort of, like, camps in terms of the culture

00:29:20
that's emerging in this space, as you see it? Or is it just too

00:29:24
big to have something? Like, crypto was unique

00:29:26
because it had the financialization to sort of get

00:29:29
everyone in line and sort of create a culture.

00:29:32
Does AI have something like that same sort of shared

00:29:36
culture? I think one of the new demarcations I've seen

00:29:40
is around safety and alignment. Like, there's a very, very, very

00:29:43
small number of people that are working on this topic of, like,

00:29:47
if we invent AGI, like, how do we make sure that it aligns

00:29:50
broadly with our preferences? This is based on the concept

00:29:54
that like any prior species that was smarter than the one that

00:29:57
came before it, like, generally made for a pretty bad experience for

00:30:01
the species that was there before.

00:30:03
And so, like, there's a contingent of people that

00:30:06
don't want to work on capabilities anymore, which is

00:30:09
broadly, like, making AI better.

00:30:11
And they only want to work on making AI safer, more aligned.

00:30:14
And you're supportive of that, skeptical of that, or do you have

00:30:17
a personal point of view? Yeah.

00:30:19
I think I'm generally supportive of that.

00:30:21
That's like Anthropic, right? That's a big company in that space.

00:30:24
Yeah. And there's a small one in

00:30:26
London called Conjecture, this one called Redwood Research.

00:30:29
There's a few people at OpenAI, like a couple dozen

00:30:32
people, how do they make their money?

00:30:34
Is it just, like, tithes from the actually profitable companies

00:30:37
to, like, feel good about themselves for the moment?

00:30:40
They don't, for the most part.

00:30:41
There's just been a Venn diagram overlap between safety

00:30:45
and alignment and effective altruism.

00:30:48
And so we've seen, for example, like, Dustin Moskovitz funds

00:30:51
Open Philanthropy, and Sam Bankman-Fried at FTX, who funded

00:30:55
a lot of these projects. And SBF did the massive round

00:30:59
in Anthropic. So, none of these companies are

00:31:02
revenue-generating yet. I don't know if they have

00:31:04
aspirations to be, but they certainly want to create better

00:31:07
tools for alignment. And your very specific point

00:31:10
that, you know, more sophisticated or more advanced

00:31:14
species, you know, sort of crush the one below, that, you

00:31:18
know, I studied philosophy in college, and I'm a big, like,

00:31:21
you know, bite-the-bullet type person on moral intuitions,

00:31:25
and there's a certain type of argument that if you're like a

00:31:27
die-hard utilitarian, and you find out, like, that this new

00:31:31
super AI, like, experiences more utility than human beings can

00:31:37
and gets, like, more joy, more, whatever the utility calculus

00:31:40
is, they get, like, more of it, then you should sort of root

00:31:43
for the AI to wipe out human beings.

00:31:46
Like, if resources could more efficiently go to the AI, which

00:31:49
gets better use out of them, it makes sense for them to go

00:31:53
there. Which I think is sort of hysterical,

00:31:54
like, I'm gonna bite the bullet all the way on this one and say,

00:31:57
goodbye, human beings. That sounds like an argument an

00:32:00
AI would make, Eric. Sorry. I don't know.

00:32:04
I'm also very worried about Roko's basilisk.

00:32:07
So I guess this would be a very, you know... That also sounds

00:32:10
like something... We've talked about that:

00:32:21
this overlord AI that already sort of exists, like, across time.

00:32:26
And so to save yourself, you need to be working towards its

00:32:29
existence, because anyone who doesn't will be, like, terribly

00:32:34
punished. And so, yeah, you should.

00:32:36
This is, this is the plot of Horizon Zero Dawn.

00:32:41
I know how that one ends. Music, by the way, I imagine that's a

00:32:45
big use case for this technology, right?

00:32:47
Yeah. Well, music is one that was

00:32:49
again, like, tried several

00:32:51
years ago, and that maybe now is at an

00:32:53
inflection point too. So we had a business a couple

00:32:56
years ago called Jukedeck, which eventually sold to ByteDance, but

00:33:00
they were, like, one of the OGs at machines creating

00:33:03
music. And now it's probably a ton

00:33:06
better. But I'm kind of, like, excited about maybe, like, the

00:33:09
more esoteric applications that are not super obvious. We

00:33:13
have a business called Intenseye, which does health and safety,

00:33:16
basically, like, protecting individuals in manufacturing,

00:33:19
industrial environments, who, unfortunately, get injured

00:33:22
because those environments are dangerous or, you know,

00:33:25
accidents happen. And there's this great documentary on Netflix,

00:33:29
which is, like, a perfect primer for why

00:33:31
this is an issue, called American Factory.

00:33:33
Sure, the Obama documentary. Yeah, yeah, exactly.

00:33:37
So this is like a solution in a way to some of the problems that

00:33:39
manifest there, where you're trying to, like, have good health

00:33:43
and safety practices, but it's just hard to do that walking

00:33:47
around with a clipboard trying to instruct people who don't

00:33:50
wear the right protective equipment, things like that.

00:33:52
So they use computer vision to apply this.

00:33:54
It's, like, a cybersecurity solution for the real world. And,

00:33:57
like we've already seen that, like, some large companies that

00:34:00
implement this immediately, you know, after like a week or two

00:34:02
weeks, see huge reductions in alerts and dangerous behaviors.

00:34:07
So that's, like, one area I didn't really know anything

00:34:10
about before I encountered this company and watched this

00:34:12
documentary and realized, like, shit,

00:34:14
this is massive, with, like, big implications, and it's

00:34:18
actually pretty cool machine learning. But it's, like, just, like,

00:34:21
the number one or number two priority for a certain category of

00:34:23
customer. So I'm really excited about these sort of domains,

00:34:26
rather than, like, the, what's in, like, the glitzy,

00:34:28
obvious limelight that every VC is going to kind of vibe with.

00:34:33
See, that feels like it's even more aligned with

00:34:35
Eric's argument that the AI should wipe us all out, because

00:34:38
if we as humans can't even protect ourselves without using

00:34:41
an AI, you know, it's like protect ourselves from each

00:34:44
other, it would seem like there's no hope, right?

00:34:46
I mean I would argue that some people think that that kind of

00:34:49
use of AI is wiping humans out. I mean, we have seen some of

00:34:53
these things, especially industries like long-haul

00:34:56
trucking, where more and more of the decisions that one can make

00:35:00
are being given over to a machine and a person is sort of

00:35:04
peripheral to the process. And it's not necessarily,

00:35:07
well, it is physically reducing things like accidents, but

00:35:10
what do you think about what's happening to the human beings

00:35:13
involved? You could argue that there are

00:35:15
other negative consequences that perhaps haven't been

00:35:17
anticipated. So, I don't know, Tom. Well,

00:35:20
maybe we'll have it both ways: humans will be wiped out

00:35:23
either way, right? Right.

00:35:25
We're just destitute from the jobs that were taken away from

00:35:27
us by AIs, or, you know, we just don't use the AI and just all

00:35:31
die of massive injuries in our factories.

00:35:34
Yeah, yeah. You guys are dark.

00:35:36
Come on, you're optimistic about it, right?

00:35:39
I mean, I think it's pretty amazing.

00:35:40
I think this is gonna be the biggest, like, productivity gain

00:35:43
for human beings in, like, a long time. I think it's going to be a

00:35:47
massive revolution. I'm, like, a true,

00:35:50
so true, believer in, like, AI changing

00:35:53
human existence far more than crypto, and, like, very happy to

00:35:56
see Silicon Valley, yes, moving back to this.

00:35:59
Yeah, I agree with that. I mean, I think that you're

00:36:01
totally right about the productivity gain, I just am not

00:36:03
sure that productivity gain is the baseline measurement for

00:36:08
whether or not humanity is getting better or worse.

00:36:10
Well, government needs to do something to say, okay,

00:36:13
we've made these productivity gains, therefore, you know, we're

00:36:16
not going to just keep grinding every human being out. Or it

00:36:19
requires policy, let's say, maybe, to cash in some

00:36:23
of the benefits for people instead of just me.

00:36:26
I think the, my baseline is, like, what kinds of problems, like,

00:36:30
weren't addressable before that now become addressable because

00:36:33
we have this new technology, right?

00:36:35
So some of that might drive productivity gains, some of them

00:36:39
might not, but I think that's, like, the coolest unlock.

00:36:42
It's, like, what can you do if a solution requires more than,

00:36:46
like, a web app and a database, like, a nice UI or something?

00:36:51
Computers dominate us in chess; they can dominate us in,

00:36:55
presumably, drug discovery or whatever,

00:36:57
once we figure that out. I mean, that was what I took you to be

00:36:59
saying earlier, it's like, yeah, we're not the best in the world

00:37:02
at games that we've been playing for much of sort of

00:37:06
sophisticated humanity. It's very likely we're not

00:37:09
going to be the best at other things.

00:37:11
we can do, especially games that we need better tools to

00:37:14
understand. Right.

00:37:15
Right. Well, it seems like, I mean, if

00:37:17
I could delineate the 80/20 issue here, you know, the 80 is

00:37:21
being, like, we've developed an AI that can beat us all in

00:37:23
Jeopardy, but like, the last 20% is, like, developing an AI that

00:37:26
can host Jeopardy. And that seems like it's the

00:37:31
hardest thing to do. I mean, we hardly can host it

00:37:33
ourselves. Well, yeah, we get to set the

00:37:34
expectations. I mean, that's why people think,

00:37:37
you know, if anything, AI couldn't be as good for sort of emotive,

00:37:42
interpersonal-type professions.

00:37:45
Yeah, because humans get to set the score on that and say we

00:37:49
actually, we prefer a human. I mean, like, yeah, it's true.

00:37:52
Like, I have a friend right now who's in the hospital with

00:37:55
cancer and I think she'd rather have the bad news delivered to

00:37:59
her by a human being who can hold her hand and be

00:38:03
emotionally connected to her in a real way, rather than an AI. Yeah.

00:38:07
I think, I think it's going to be hard to replace that.

00:38:11
Yeah, this is true, but I think even in that example, we

00:38:13
have a company that is now part of a bigger drug discovery

00:38:17
business called Exscientia. But, like, in that case, like, the

00:38:20
doctor is trying to make an assessment as to

00:38:22
which therapeutic strategy is the best for this patient.

00:38:25
And that's really, it's a really hard choice to make.

00:38:28
And at the moment, like, what they've been doing is at best

00:38:31
sort of, like, doing a biopsy and sequencing and seeing what is

00:38:34
the gene that might be causing the cancer,

00:38:36
and then just taking a drug that, you know, in theory fights

00:38:40
that specific mutation. But this company that we

00:38:42
invested in, they actually take that same biopsy and basically

00:38:45
run, like, a clinical trial in the dish,

00:38:48
by having that biopsy in the presence of, like, one of hundreds

00:38:52
of different cancer agents, and you can functionally measure

00:38:55
whether this drug is, you know, fighting the cancer or not.

00:38:59
And they've actually proven that, like, you see statistically

00:39:02
higher survival rates because you're functionally assessing

00:39:06
cancer drug performance against the patient's

00:39:08
tumor, versus just, in a very reductionist way, like, doing

00:39:12
mutation-and-drug matching. And that step, I have

00:39:14
no doubt, is going to make huge advances.

00:39:16
And I'm just saying that, like, somebody has to hold her hand

00:39:18
and say, you're going to die. I think most people would rather

00:39:21
have that news delivered by a human being.

00:39:23
Yeah, and also, like, we saw a lot of this too during the pandemic,

00:39:27
when people had to do things like give birth by themselves,

00:39:31
just alone, messages coming into their phone.

00:39:34
For some reason it didn't really feel that comforting.

00:39:36
They didn't really feel like that was an optimal experience.

00:39:39
They still wanted a human being, someone they knew, standing next to

00:39:43
them while they did this but they just couldn't have it.

00:39:45
Yeah, I think for the most intimate and personal of

00:39:48
interactions, an AI should never replace that.

00:39:50
Like, it's not something that, unless you're absolutely lonely,

00:39:53
and, you know, that's a whole other thing, too, people having chatbot

00:39:56
conversations. I mean, that's just the essence,

00:39:58
the essence of being human, is being lonely, right?

00:40:00
Right. But I mean, that would probably be,

00:40:02
like, the last, you know, chapter, the last, you know, moment of

00:40:05
humanity, is, like, us helping ourselves, you know, into our

00:40:09
obsolescence and, you know, eventual destruction as we

00:40:12
comfort ourselves into our death.

00:40:15
But isn't it so interesting that we're even having a

00:40:16
conversation affirming

00:40:18
the idea that in life's most intimate moments, we actually

00:40:21
want human interaction, just in case anybody forgot.

00:40:25
Yeah, just, just reminding ourselves. But, like I said, the

00:40:29
last human, you know, comforts the, the second-to-last

00:40:32
human, or vice versa, you know, and the AIs are watching us

00:40:35
through their screens and saying, it is almost done.

00:40:38
This is so clearly, this, this is an AI-derived modification of

00:40:43
William Faulkner's Nobel acceptance speech, clearly.

00:40:49
It's, like, the last puny voice of humankind ringing

00:40:52
out, you know, across the hills. I like this.

00:40:55
Like, the last guy, yeah,

00:40:57
comforts the last guy. I don't know what I can add to those

00:41:01
conversations that, like, you guys have not

00:41:04
found. It's that, I mean, I like working

00:41:06
on, like, practical solutions, like real problems,

00:41:09
like, not, not this doomsday thing. Do your friends, I

00:41:12
think, theorize about what AGI

00:41:19
will mean? And, all right, I mean, it's not just sort of the

00:41:23
math, I mean, right? You spend a lot of time with

00:41:25
these alignment exercises? Yeah.

00:41:27
I actually don't spend a lot of time thinking about AGI,

00:41:29
to be honest. Because you think it's sort of just a total mind

00:41:33
game distraction, or you just, like, don't find it interesting? I think it

00:41:36
is a bit of a distraction to some degree. Like, it's a bit

00:41:39
outside the short term for me in terms of traction.

00:41:40
Like, I've no idea when this will happen, and, I don't know about,

00:41:44
you know, like these surveys that ask people, you know, over

00:41:47
what space of time do you think AGI

00:41:49
can arise? In many cases, like, those questions are

00:41:52
formulated in a way that presupposes an answer, and so

00:41:55
they're kind of biased. But you're saying the surveys say

00:41:58
sooner than you think is credible? Or, yeah, to what degree?

00:42:01
I don't know. But, yeah, it does feel a bit

00:42:03
sooner, and, by the way, like, the date has, like, been getting closer.

00:42:07
Well, and also people are incentivized to make it sound

00:42:09
like it's sooner than it is, because that makes it sound like

00:42:11
a reasonable investment. Like self-driving cars.

00:42:13
Yeah, self-driving cars.

00:42:18
They can't predict general intelligence. Cool.

00:42:22
Thanks so much for coming on. Yeah, thanks for having me.

00:42:25
And Tom, are we wrapping on this, or do you want to do more?

00:42:28
Sure. If you guys wanna stick around,

00:42:29
we can spend a few minutes on the email.

00:42:30
We can do that. But Nathan, thanks so much for

00:42:33
coming on. Really, really appreciate it. Great

00:42:35
to talk to you. Thanks again, thanks.

00:42:37
Thank you. Do you want to spend a few

00:42:42
minutes on the email, or do you want to read us the full email that he

00:42:46
sent about pronouns? Yeah, so this was the, I guess,

00:42:48
now, former CEO of Mail, Mailchimp.

00:42:52
The reason I wanted to go through it is not because, you

00:42:55
know, I mean, the email itself I thought was pretty hysterical,

00:42:57
but also it does touch on a few things that have actually been

00:43:00
themes on the show before. And so beyond just, like, laughing at

00:43:03
this guy for sending a 1,500-word email that,

00:43:06
you know, I mean, any HR person, any HR

00:43:11
person would have, like, thrown themselves off a building to

00:43:13
stop it from happening. But, you know, no one, no one tells these

00:43:16
things to CEOs. I don't know how dedicated you

00:43:19
think these HR people are to their companies, but, yeah, yeah,

00:43:22
clearly a couple... you know, like, an email marketing firm has

00:43:26
acquired that kind of loyalty? Like, HR person,

00:43:28
if you're thinking about throwing yourself off a

00:43:29
building on behalf of your company, you need to wake up.

00:43:32
Let me just read through the email, we could just reflect on it

00:43:34
for a little bit, and then if we see it's getting long or it's

00:43:37
boring, we could just call the episode.

00:43:39
Okay, so again, this is an email that was sent by the now-ousted

00:43:43
CEO of Mailchimp, which is an email marketing firm.

00:43:46
The story was broken by Platformer, and Zoë Schiffer, who's the

00:43:49
writer there. So this is the email. Sorry, the

00:43:52
guy's name is something Chestnut or something?

00:43:55
I don't know him. By the way, Ben Chestnut,

00:43:58
Chestnut. No.

00:44:00
Hi team. I've been really impressed by

00:44:02
how well the new employee onboarding has been going lately.

00:44:04
We're bringing on so many new peeps.

00:44:06
Oh, yeah, that's the thing in the email:

00:44:07
he calls people peeps the whole time through.

00:44:10
We're bringing on so many new peeps and in turn, they're

00:44:12
bringing on their own great questions and making the chats

00:44:14
very Lively. Kudos, I want to take a quick

00:44:17
moment to slightly recalibrate something

00:44:19
before it goes too far. This is where it starts.

00:44:22
I've never read this. So I'm like, experiencing this

00:44:25
live, okay? I'm sure Katie hasn't either

00:44:27
because she has other things to do. I am noticing that whenever

00:44:31
new employees introduce themselves in Zoom before asking

00:44:33
their question, they're also announcing their pronouns.

00:44:37
This is completely unnecessary when a woman, parentheses, who

00:44:40
is clearly a woman, to tell us that her pronouns are, quote, she

00:44:44
/ her, and a man, parentheses, who is clearly a man, to tell,

00:44:47
tell us that his pronouns are, quote, he / him.

00:44:51
However, if there is an employee with gender dysphoria, in the

00:44:53
room, who feels more comfortable, this is coming from

00:44:56
the CEO. By the way, to all of the, like,

00:44:58
all the employees of the company, just want to make that

00:45:01
clear. There's an employee with gender

00:45:02
dysphoria in the room who feels more comfortable if we know

00:45:04
about and use a non-obvious pronoun.

00:45:07
for them. Non-obvious means that they might appear to be one

00:45:09
gender to others. But in their minds, they

00:45:11
consider themselves to be another gender.

00:45:13
They are very welcome to proclaim that pronoun to others

00:45:15
in the room, and, for the record, it is my desire

00:45:17
that Mailchimp is a respectful place that will honor that

00:45:19
request in the name of inclusion. So basically, like, the guy is

00:45:22
trying to explain himself and the why of writing this email,

00:45:26
and do it in a way that's very thoughtful. And, you know, he's

00:45:29
not trying to say, I have a direct issue with people

00:45:31
claiming they're one gender or another, you know, it's going to

00:45:33
be very, you know, whatever. But this is where the problems

00:45:36
kind of crop up, in the next paragraph.

00:45:38
It seems as though there is a very kind and compassionate

00:45:40
intention by someone somewhere in onboarding to accommodate our

00:45:44
co-workers who use non-obvious pronouns by making them feel

00:45:47
comfortable enough to announce their pronouns.

00:45:49
Indeed, an intimidating thing to do in front of a crowd.

00:45:52
What a mess. The logic seems to be that if

00:45:56
everyone else is announcing their pronouns, and that is the

00:45:59
logic. I know where you're going with

00:46:01
this dude. And that is exactly the logic.

00:46:04
Yeah, we all see where... Well, logic becomes a key word here, as

00:46:07
you'll soon find out. The logic seems to be

00:46:09
that if everyone else is announcing

00:46:10
their pronouns, then they are making it easier and more

00:46:12
comfortable for the trans/gender-fluid employee to

00:46:15
announce their own. That is truly kind,

00:46:17
and I love that intention. Yeah, but... And here's

00:46:20
the but right, so far, so good. But in the long run, this

00:46:25
approach does more harm than good.

00:46:28
There are, there are three reasons for this. First, there is

00:46:33
a tiny, there's a very tiny number of peeps at Mailchimp

00:46:36
who would consider themselves transgender. Forcing, either with

00:46:39
orders or through guilt, approximately 1,390 other peeps

00:46:43
to adopt a new communication paradigm that humanity has never

00:46:47
had to use in our 300,000 years of existence

00:46:50
and in our 150,000 years of spoken

00:46:52
language... I don't know where those numbers came from.

00:46:53
And we would definitely not want any peeps who've not yet

00:46:56
publicly identified as such to feel comfortable doing

00:46:59
so. We want to keep that number really small here at Mailchimp.

00:47:02
We don't want any more people feeling comfortable talking about

00:47:05
their gender, because it goes against three hundred thousand

00:47:07
years of tradition. In order to make things...

00:47:10
Traditions that were bad that we got rid of? I'm gonna throw

00:47:13
slavery out there as one, but continue.

00:47:15
Yeah. They never, you know, in ancient

00:47:16
Sumeria, announced their pronouns, and I think we should

00:47:19
honor that. In order to make things slightly more comfortable

00:47:23
for an extremely small group of peeps is completely illogical.

00:47:26
The group they were trying to keep as small as

00:47:27
possible. Yeah, that's right, he did say, like,

00:47:29
very small transgender people, too.

00:47:31
They're all just little tiny, little tiny, you know.

00:47:34
Anyways, what's the harm? Well, what is the harm?

00:47:38
Yes, you may be asking yourself, you know, 300 words into this

00:47:41
email, what is the harm or what is the purpose of this email?

00:47:44
You will now find out. Well, I believe that whenever

00:47:47
One is forced to comply with something that they know is

00:47:49
illogical. No matter how kind the intention

00:47:52
they will eventually believe anything and do anything even if

00:47:56
it's vicious. We're undermining logic and

00:47:58
reason, which undermines independent thinking, which

00:48:01
history has shown always leads to disastrous consequences.

00:48:05
Forcing a majority of peeps to behave

00:48:07
a certain way is the opposite of inclusion.

00:48:10
So basically at this point he decides to go like slippery

00:48:12
slope with the whole argument and say like if we start

00:48:15
announcing people's pronouns in meetings, when it's only a small

00:48:17
number of people, we are bringing about the ruination of

00:48:20
civilization. Wow, because never in history,

00:48:24
even recent history, have we asked the majority of

00:48:28
people who do not agree with a changing social norm to

00:48:33
comply. We've never done that.

00:48:36
Yeah, I just don't know... How about, like, interracial marriage?

00:48:42
We've never, we've never tried to pave the way for those

00:48:45
things socially through things like language and legislation.

00:48:50
One of my reactions to this, which is very like somewhere in

00:48:54
this sort of management space. It just feels like if he has an

00:48:58
issue with this and the best lever he has is to reach out to

00:49:02
the whole company instead of trying to get his subordinates

00:49:06
or whatever in line and saying, this is how we want to handle

00:49:09
onboarding. It's sort of a miss, it shows

00:49:12
like, a lack of a handle on the company. And sort of like, hey,

00:49:16
wouldn't he want to speak to other people

00:49:19
before he sent this email to see if, yeah...

00:49:21
Get your deputies to agree with

00:49:24
you or yeah. Yeah.

00:49:26
Well, he seems to be pinpointing it on some process in

00:49:28
onboarding, which is like an HR function.

00:49:31
And yes, it would seem like if you have to have this

00:49:33
conversation because it is just fucking killing you, all the

00:49:36
illogic that's going on because it's only tiny transgendered,

00:49:39
people that should be announcing their pronouns.

00:49:41
You could handle this in a smaller group than every single

00:49:44
employee at the company. I am personally very skeptical that

00:49:48
this pronoun-announcing thing is going to stick in our

00:49:52
culture. I am not like going to be one of

00:49:56
these people like protesting it, but I just don't, I feel like

00:50:00
already the discussion, like, on my TikTok feed among, like,

00:50:04
progressives is, like, is it good to be centering gender

00:50:08
so much in, like, our introductory conversations? And Katie, to your

00:50:12
point, they're like, sure, maybe you're

00:50:15
making it easier for people to come out because you asked, but

00:50:18
you're also pressuring people to make a gender statement, like

00:50:22
one of the first things they say to everyone.

00:50:24
So, I just think even in the world of like just Progressive

00:50:28
argumentation, I am not sold on the fact that these gender

00:50:34
intros are going to stick and I do think it's reflective of

00:50:37
extremely heavy-handed HR progressivism, which is the

00:50:43
worst form of progressive culture, like LinkedIn,

00:50:46
basically forcing everyone to put their gender on

00:50:49
their LinkedIn is not an eye-opening thing.

00:50:51
It is exactly the sort of, like, statist forced morality of the

00:50:55
left that no one likes and will not win people over.

00:50:59
So while I agree that these people protest way too loudly

00:51:02
about, like, gender shit, and, like, who cares about having to see

00:51:06
she/her pronouns, I do think the instincts of it are... Like, I'm not

00:51:11
sure it's a winning issue for the left.

00:51:13
I mean I don't think that it will stay around forever but not

00:51:16
for the reason you've said I mean, Like I think that it's

00:51:18
something that's happening now, in order to pave the way for a

00:51:22
group of people who do not feel comfortable to feel comfortable

00:51:26
and that once they feel more comfortable and we don't need to

00:51:29
have this happen. It won't happen.

00:51:32
Once people are trained to just, like, not assume...

00:51:36
Just like we've all been trained, not to assume that

00:51:37
somebody is married to a woman just because he's a man.

00:51:40
So, when I meet a man, I don't say oh, how's your wife?

00:51:43
If I see a ring because I don't assume that he's married to a

00:51:46
woman. It took a really long time to

00:51:49
get there but now we're all trained.

00:51:50
And so some of the linguistic things that it took to get

00:51:54
there, have also faded away and I will say, you know, as

00:51:57
somebody who's friends with now, multiple parents of children,

00:52:02
who do not identify with the sex that they were born with.

00:52:05
It is really, really painful. I mean, like this is not, this

00:52:09
is it's really difficult. And so I think that you're right

00:52:12
that we won't always have to center gender in this way.

00:52:15
But that there's a reason why it's happening.

00:52:17
You're saying the misgendering is painful, or the whole extra

00:52:20
thing? The whole topic. I think that

00:52:23
the topic is inherently... there's a lot of

00:52:26
pain to go around, and it goes beyond just simply misgendering,

00:52:29
and so it's not only to make people who are having questions about

00:52:35
their gender, or are gender non-conforming, feel comfortable.

00:52:37
It's also to make their families feel comfortable and make their

00:52:40
parents feel like, my kid is entering a world, you

00:52:44
know? I mean, I think that one of the

00:52:46
things that they fear is that their children will be beaten

00:52:48
up or harmed. I mean, I think any, so any acknowledgment that kind

00:52:52
of... even if, to your point, it doesn't work, right?

00:52:56
Anything that creates some feeling of, like, the world...

00:52:59
I mean, it's always crazy to me, because conservatives are like,

00:53:03
you want to be performatively kind to everybody, like, what's

00:53:07
wrong with you? Yeah, it is a thing.

00:53:13
It's like God. Like, just like I was going to

00:53:19
say just like, I think it's good that we don't assume that every

00:53:22
woman who walks in wearing a wedding ring is married to a man

00:53:25
and that it's totally and that we had to be kind of

00:53:28
performatively kind to get society to that

00:53:32
place. We did it's clearly like an

00:53:33
awkward part in the like you know movement towards being a

00:53:36
more inclusive and Kinder gentler world but actually a lot

00:53:38
of the topics that you guys are bringing up come up in the next

00:53:40
couple paragraphs. And I really want to bring it

00:53:43
up. Oh, there's like 20 more.

00:53:45
I thought that was the end. Oh my God.

00:53:46
Oh my God, no. No, no, no, right?

00:53:48
Have one point. The more points you have, the more people can be

00:53:51
mad about. Right.

00:53:52
Okay. So that was, that was a slippery

00:53:54
slope argument: that what we're doing is we're

00:53:56
descending into a world of illogic, and, like, soon we'll have, like,

00:53:58
fucking ants wearing hats because it doesn't make sense,

00:54:01
you know, for people to announce their gender pronouns.

00:54:02
If it's very obvious, what their gender is, okay?

00:54:05
Second and by direct one-on-one conversations with a small

00:54:08
subset of that, small population of transgender employees.

00:54:11
Let me emphasize again: these are very, very small

00:54:13
people. I have found that they don't

00:54:17
even need or want all this accommodation.

00:54:19
All right. This is this is interesting,

00:54:21
they don't and I'm sure I'm assuming if these people exist

00:54:24
and he's not making this whole thing up.

00:54:25
This is interesting, there is an employee who started as a woman

00:54:29
but then transitioned into a man. During the transition, he politely

00:54:32
came to me and other leaders and respectfully asked us all to

00:54:34
honor their transition by using new pronouns. It was our

00:54:37
pleasure to honor that request. He now uses he/him pronouns has

00:54:41
used the men's restrooms, has never wanted a gender-neutral

00:54:43
restroom, and additionally has worked damn hard to earn a new

00:54:46
career, his new place in life, and most

00:54:48
important, I'm sure has achieved peace in his mind.

00:54:51
Just providing a place where they could earn a living and do

00:54:54
good hard, meaningful work, helped him find inner peace.

00:54:57
And the fact that it's happening at Mailchimp is a little weird side

00:55:00
point, but every company thinks that. This section I'm not disturbed

00:55:04
by. No, on the contrary, I think this is his most, this is

00:55:07
the most, like, accommodating and, like, well-intentioned part.

00:55:10
Well, it's all supposed to be well-intentioned, but this is the one

00:55:11
that, like, actually makes the most sense, right?

00:55:13
Because it's based on real people, not, like, theory. I think

00:55:15
that's what's frustrating about some of these culture

00:55:17
war issues. It's like, well, when faced with a real moral decision

00:55:20
around a specific person, I feel like I acted morally.

00:55:24
And yet I get yelled at, by the HR department, like certainly I

00:55:27
think a lot of us are sympathetic to that kind of

00:55:30
right point of view, right? So, I don't want to read through

00:55:32
this whole paragraph, but you get the point.

00:55:33
Basically, the CEO is saying, I've talked to a few transgender

00:55:36
employees that we have here, and they've all specifically

00:55:39
requested that we don't do this because it's uncomfortable for

00:55:42
them and I want to honor that. So it's like, okay, okay,

00:55:44
interesting. Well, good, good. There's a different way to do

00:55:47
this email, I'm seeing it right now, but continue.

00:55:50
Yeah, okay, so here's where things

00:55:51
get very interesting to me. Third, this used to be about

00:55:54
fostering a creative, productive work environment.

00:55:57
With that intention in mind, Dan and I have always

00:56:01
wanted MailChimp to be an inclusive meritocracy a place

00:56:05
where no matter your lifestyle gender race, nationality, or

00:56:07
economic background, you could be an independent thinker and

00:56:10
speak up. Not only would you feel

00:56:12
emboldened to speak up, your fellow peeps would listen and

00:56:15
take your customer-centric advice. It was...

00:56:17
In the name of... Yeah, it was all in the name of

00:56:21
work, but now everything is incredibly politicized.

00:56:24
That's probably true. Listen, I long for the days when I could

00:56:28
have a workplace that's not completely...

00:56:30
Yes, I got that. This is the part where like, you

00:56:33
know, his argument is verging into uncomfortable territory,

00:56:36
but that I actually probably agree with, right?

00:56:38
I am finding that peeps no longer feel motivated by

00:56:41
meaningful work. They are motivated to make

00:56:43
political statements that is definitely true.

00:56:46
Yes. And now, I mean...

00:56:47
I'm sympathetic with a lot of what this guy's saying, so I get

00:56:50
to the really difficult part. But yeah, because this is like

00:56:53
every but that's always a minority.

00:56:55
I mean, it's a vocal minority who feels motivated by,

00:56:58
like, protest. Most people do just want to

00:57:00
clock in and clock out, do your fucking job, right?

00:57:02
Like, even if meritocracy is a fucking farce, like it's a

00:57:06
necessary fake belief of capitalism like, I'm sorry.

00:57:11
Okay. Let me finish up this paragraph

00:57:12
because then I want to unpack more. They're using company

00:57:15
time and company resources to win a game

00:57:17
against their opponents in that game that is raging in their minds

00:57:20
and on social media. Understandably

00:57:23
so, our society is becoming increasingly divided, and it

00:57:26
feels truly like our social fabric is being torn apart at

00:57:29
the seams by radical politics on both sides. Coercing peeps

00:57:33
into proclaiming their pronouns is not about creating an

00:57:35
inclusive creative productive work environment, it's about

00:57:38
making a political statement. The only thing I would have to

00:57:41
say to that, by the way is like it's a very brief political

00:57:44
statement. You know.

00:57:45
Like if you really are all about political statements just do

00:57:47
like More land acknowledgements and shit.

00:57:49
Like those things take a long time you know, you're spending a

00:57:51
lot more time on that one than just saying.

00:57:53
Like, my name is Tom, he/him. But, but whatever.

00:57:56
As righteous as some peeps might think that that is... The 'peeps'

00:57:59
really does make me... Really?

00:58:01
Really, it undermines everything, the 'peeps.'

00:58:05
Yeah, yeah. There's really no, there's no

00:58:07
coming back from that, personally. But, as

00:58:10
righteous as some people might think that this is, they should

00:58:12
also consider that there are others in this world.

00:58:14
And on the opposite end of the political spectrum, who feel

00:58:17
righteous about their beliefs. Understanding and respecting

00:58:20
the fundamental concept that grown adults can have different

00:58:22
views is a part of being American and part of being a

00:58:25
mature adult. Peeps of all the different political leanings are

00:58:29
free to vote the way that they want to and see our country

00:58:31
governed... Is he basically trying to say that saying pronouns is

00:58:35
triggering conservatives in the company and making them feel

00:58:40
political whenever this comes up? Right, right.

00:58:43
I don't even need to read any more.

00:58:44
You guys get the point of what he's saying here in this

00:58:46
paragraph. And the reason that I liked this, this

00:58:49
part of it is because this gets to two things that we talked

00:58:51
about a ton on the show, which is that trans issues, which we're

00:58:56
always talking about. Yeah.

00:59:17
...environment. The fact that, you know, and you probably see this

00:59:20
more, Tom, certainly more than Eric does, a little

00:59:22
more than I do, Katie, but, like, yeah, he's, like,

00:59:24
fantasizing about his work environment. It's just, as an employee, watching...

00:59:29
Yeah. I can't wait, now

00:59:31
we're going to start sending these emails out,

00:59:32
Eric. Yeah, just shop it all

00:59:34
with us before you hit send. Dear Newcs.

00:59:37
You know, is that what you call your employees? Oh, now we

00:59:42
do. Yeah.

00:59:45
So yeah, you know, it's this idea that, like, the workplace is

00:59:47
becoming this battleground for some employees.

00:59:51
Yeah, I agree with the CEO here. Like I do think there are very

00:59:53
vocal, I'm sure, minorities of people

00:59:56
that are trying to, you know... And I say minority not because I

01:00:01
don't think these issues are important, but simply because, even if you

01:00:05
look at, like, big workplace movements, I think the Washington Post did a

01:00:09
great story on this like the Starbucks employees moving to

01:00:12
unionize. It is a real thing and it's

01:00:14
really important and I I think that this is a real movement

01:00:19
that has legs and it's not just something made up at the same

01:00:24
time. It's clear.

01:00:25
Even from that story, that many, many people are just clocking in

01:00:28
because they need money and they're like, really not

01:00:31
engaging, you know, there's like maybe I'll wake up one day and

01:00:34
be a union employee, I mean... And there are, like, class-president types

01:00:38
trying to push it, right, right. But there's a line between the

01:00:41
Trader Joe's and the Starbucks employees and what's going on at

01:00:43
most of these companies, where they're all white-collar workers.

01:00:45
And, you know, the ones that are very vocal are trying to, I

01:00:49
don't know, realize their politics. It's like on college campuses,

01:00:52
where the students are trying to unionize the employees.

01:00:56
But it's like, these students are leaving after four years and

01:00:58
the people who are working in security or working in

01:01:02
dining services. They are not leaving after four

01:01:05
years. So you have it's not a

01:01:06
professional class, but it's a group of students who feel very

01:01:09
passionate. And I understand why I'm not

01:01:12
saying they're wrong. I'm just saying that the

01:01:14
incentives are very different for these two groups of people

01:01:16
and the people who will stay and the people who will go, right? But it is

01:01:20
very distracting inside these companies.

01:01:22
I mean, look at Google now. That company is, like, borderline

01:01:25
paralyzed, like, certain departments are, because of

01:01:29
the activism and outspokenness by certain people at the

01:01:32
company. And I'm not saying it's a good

01:01:34
or bad thing. I'm saying, it's a reality and

01:01:36
it's also just the result of years

01:01:39
and years of all of these companies telling their

01:01:41
employees that your personal beliefs should be wrapped up in

01:01:43
the mission of this company. And that what you stand for is

01:01:47
what you are working on.

01:01:48
And this was always going to happen.

01:01:50
In my opinion, there was always going to be a point where people

01:01:52
got disillusioned by the mission and felt, wait, my personal

01:01:54
beliefs. If the only place where I can

01:01:56
express, my personal beliefs is at the company that I work, then

01:01:59
I need to spend all of my time making sure that everyone knows

01:02:02
how I feel because otherwise it doesn't make any sense to me and

01:02:05
the sort of, like, elite white-collar left has become very content

01:02:10
with, like, statements of solidarity as some, like, major,

01:02:14
like, political victory instead of, you know,

01:02:17
what actually gets done on, like, I don't know, material conditions,

01:02:21
or, right, you know, actual political achievements.

01:02:24
They're like... Yeah, I know, you don't, but go on.

01:02:28
No, I'm like, if you guys want to be activists in your companies,

01:02:31
you should be demanding your companies to pay their fucking

01:02:33
full freight in corporate taxes. But that's just... Oh yeah.

01:02:36
You know, my God, that's my favorite thing

01:02:39
I think anyone has ever said on this podcast, when you're like,

01:02:41
Apple doesn't pay enough taxes. The biggest political issue of

01:02:45
our time is for the government to, like, do

01:02:58
core functions. Like, whatever, you're right.

01:03:03
Because all the Apple employees that, you know, spent months and

01:03:05
months, complaining about, you know, whatever culture issues,

01:03:07
they had at the company. I don't think a single one was

01:03:09
like, why are we incorporated in Ireland?

01:03:11
Right. Right.

01:03:13
And why can't we enforce voting rights?

01:03:16
Like the ones we have on the books. Like, why does it take so

01:03:19
long? Why has our criminal justice

01:03:20
system ground to a halt? Like, why are there not enough

01:03:23
people to investigate white-collar crime? Why?

01:03:25
Yeah, yeah. And so, like, you know, the

01:03:27
whole discussion about gender pronouns on both sides, I mean,

01:03:30
look, I yeah, you really came into this Tom.

01:03:32
Like, we were just going to eviscerate this email.

01:03:34
I actually don't think... This podcast is, like, so happy to

01:03:39
eviscerate it on the grounds of, like, what a dumb fucking thing

01:03:41
to write. Like, he should have actually, as he said,

01:03:44
interviewed people who are trans at his company, gotten

01:03:47
their thoughts, and written an email that started:

01:03:49
I've had really important conversations with members of

01:03:53
our community, and no 'peeps,' members of our community.

01:03:56
And I want to know whether or not there are ways to figure out,

01:04:01
If there's a way to figure out, you know, why we are centering

01:04:05
gender with the use of pronouns. There is some discomfort coming

01:04:08
from the very group of people we're hoping to make comfortable,

01:04:11
and so what does that mean? Can we discuss it as a community

01:04:14
and come up with a different plan? And that's also a much shorter

01:04:17
email. That email can be a couple hundred

01:04:18
words, four paragraphs, four short paragraphs. And the takeaway

01:04:22
should have been, there's something deeply wrong with the

01:04:25
human resources professionals. We need to, we need to get rid

01:04:31
of the ones we have and hire for real ability, to be sure.

01:04:35
Because because because here's the thing, Katie, if that was

01:04:37
actually the problem here, he would have sent that email and

01:04:40
it would have accomplished something that

01:04:42
would have actually benefited the very, very small, we have to

01:04:45
include that, number of trans people at Mailchimp. It's like the Lord,

01:04:49
and you're trying to keep it small, buddy, right?

01:04:51
You did a great job. But that's clearly not

01:04:53
what's going on here. He was triggered by the use of

01:04:57
these, the private use of these words.

01:04:58
he's like using the small number of trans people at MailChimp to

01:05:01
hide behind because he's pissed about something which is also

01:05:04
like oh God I mean just it's so fucked up but like why didn't he

01:05:10
call us to write this email for him?

01:05:11
Why didn't he give us a ring? We could have done this.

01:05:14
Yeah I mean it goes into the sort of Who is politicizing what

01:05:19
between the left and right, right?

01:05:20
Like, the left is like, I don't know, we're just saying people's

01:05:23
pronouns and then we, like, have a fucking meeting. And the right

01:05:26
obviously reacts very negatively, and then it's sort of

01:05:30
hard to say which move is the politicization, right?

01:05:34
Like if his claim is that he's not being political, right?

01:05:36
He's trying to remove politics from the workplace, but he's

01:05:39
basically saying that when people have to say pronouns, and

01:05:43
people on the right have to do it, they're feeling like

01:05:46
it's a political act and they're basically being forced to sort

01:05:51
of, you know, go against their principles by doing something

01:05:54
that they, yeah, bristle at.

01:05:56
And we've seen different versions of this playing out in

01:05:58
Tech with like, the CEO of Kraken and like his super based

01:06:01
work culture, where he wanted people to be expressing

01:06:04
only, like, dark web ideas. And if you don't like it, you

01:06:06
can leave. And you know, the Brian

01:06:08
Armstrong at Coinbase stuff. I mean, it was all very tied...

01:06:11
Yup. It was all very tied up in

01:06:13
crypto, but like, it is a very real thing that's happening at

01:06:15
these tech companies and the decision by CEOs to claim we can

01:06:19
be non-political by sending out emails like this is a

01:06:22
completely wrongheaded way of doing it.

01:06:24
Of course it's going to backfire and the MailChimp guy he stepped

01:06:27
down. Yeah, he's gone.

01:06:28
He's gone. And we don't know if it was

01:06:29
because of this, but, like, it was right after. You know, the story

01:06:33
didn't say, which I think was a bit of a shortcoming with the story,

01:06:35
it didn't really explain why. But, you know, they had gotten

01:06:38
acquired by Intuit. I think he actually made quite a

01:06:40
bit of money from that whole transaction.

01:06:43
He's free to do this bullshit. He can have his own...

01:06:46
Like, he can have his own staff. He can do whatever he wants.

01:06:49
Have them never tell him their gender if he wants.

01:06:51
Yeah. Hey, he can just

01:06:52
have no staff. He's like, I'm moving to a model

01:06:55
where I have no staff. No people, who are messy.

01:06:58
No HR departments. I mean, people stepping down and

01:07:03
like, overreacting to employee revolts is another part of this.

01:07:07
I mean, I think that's sort of calmed down somewhat.

01:07:09
Like, I would be interested to know now if he was really

01:07:12
ousted over this. It wouldn't make sense.

01:07:14
I mean, the email was, I can't imagine why anyone should be

01:07:19
fired over this. Although it does, to Eric's earlier

01:07:22
point, indicate that he's a bad manager, and so I wouldn't be

01:07:26
shocked if, when you scratch the surface beyond the email, you'll

01:07:30
find stuff that's, right, kind of weird.

01:07:32
Right. Clearly. Don't...

01:07:32
Like, just go to your executives and say, are we

01:07:35
aligned that this is, like, out of control?

01:07:37
I'm sure most of them would be like, listen, we're trying to

01:07:40
meet our sales quota. This is not, like, a driving

01:07:45
decision, like, who cares? Like, we're going so fast,

01:07:48
please. Right.

01:07:49
Exactly. But he's like, but he's trying

01:07:53
to say here, you know that, like, all of the pronoun

01:07:55
discussion is distracting us from a real fulfilling work of

01:07:58
running MailChimp and like, he's just like, I got a way to fix

01:08:01
it. We're not getting our numbers.

01:08:03
Actually, maybe the bigger issue

01:08:05
is you're only offering people Mailchimp. In the same way that,

01:08:08
you know, like the Fed, you know, the only thing they can do

01:08:10
to control inflation is by raising

01:08:12
interest rates, he's like, the only thing I can

01:08:14
do to increase productivity is sending two

01:08:16
more emails eviscerating our pronoun policy.

01:08:20
And he's like, that's it, that's the only trick

01:08:21
I got, my bad. Nothing focuses a company more than a divisive

01:08:26
email from the boss. I remember having seen

01:08:29
companies really come together, rallying around the shared sense

01:08:32
of, your boss is evil. They would send something,

01:08:35
and I would spend half the day being like, what the fuck

01:08:38
does this mean? And everybody would like just

01:08:39
ignore it, like, who cares? It's like, those emails are the most

01:08:43
distressing. It's like, why am I working

01:08:45
for overlords that are so out of touch with,

01:08:47
like the core product that we deliver?

01:08:50
I mean talk about like the use cases for AI.

01:08:53
How could there not be, like, a super advanced Clippy on all

01:08:56
boss emails that says like, it seems like you're writing a very

01:08:59
ill-conceived email about what pronouns your

01:09:01
employees use. Are you sure you want to send it?

01:09:05
This exists. Did you not see this?

01:09:06
I saw it on Twitter. Somebody ran this email through

01:09:09
some, like, I don't know, woke censorship app, and it

01:09:13
gave this, like, a 0% score or something.

01:09:18
Like, can we get some common sense out of anything? It's a

01:09:24
little weird that that software exists, to be...

01:09:26
Like this is what I'm saying is, was that indeed the software, or

01:09:30
was it just, like, I don't know... So we don't, we don't know

01:09:34
what this software was, but yeah, there's not much more to

01:09:38
say about this than, like, you know, chin up for everyone at

01:09:42
MailChimp. One thing that I'm interested to

01:09:44
talk about, that I don't think we should get into in

01:09:47
this episode, but I do want to sort of just plant

01:09:49
the seed on, is getting to a point in our political debates where we

01:09:52
can, like, shrug our shoulders on some of them. We've

01:09:55
basically gotten into a political culture where we've

01:09:59
tried to like amplify the importance of every political

01:10:02
issue. And so then whenever there's

01:10:04
disagreement, it feels like a real, I don't know,

01:10:08
severing point. And there aren't these issues where it's like,

01:10:13
people just say to each other. Yeah, I disagree with you.

01:10:16
I don't care about this issue that much, you know, you would

01:10:18
be seen as like, obviously like a bigot or anti-trans to say,

01:10:22
like, this is below my line, like, come on.

01:10:25
I mean, he was basically saying there's, like, a hierarchy of

01:10:28
political views he cares about, and one was low.

01:10:31
I mean, people went ballistic, he ultimately survived, but

01:10:34
there does have to be a true ranking of issues

01:10:38
you're willing to, like, put your whole life

01:10:41
behind. And I think it can be different for everyone. Like, you

01:10:44
can get in trouble for not prioritizing.

01:10:46
I mean, I think, I think what he got in trouble

01:10:49
for was coming across as an excessive dick.

01:10:51
I mean, I think, and I think that was the criticism. And I think that

01:10:54
most people prioritize the social issues that are swirling around

01:10:59
us based on what's important and pertinent to their lives.

01:11:01
I don't think that they're, maybe, like, staggering them and writing

01:11:04
a blog post about it, like he might, but I think if you are,

01:11:09
for example, I mean, The way I grew up, you know, growing up in

01:11:13
the 80s and 90s with not very much money in a dying

01:11:17
blue-collar town. You know, there were a lot of

01:11:20
social issues happening at that time, but the ones that were

01:11:22
most important for like, the people I knew were economic

01:11:25
issues. And, like, there was obviously a

01:11:28
lot happening with gay rights like it was the 80s and 90s.

01:11:31
There were people being beaten to death and that was for some

01:11:35
people a lower priority issue because we didn't know anyone

01:11:37
who was gay. Or, for people who were closeted in my town,

01:11:41
that was the highest-priority issue.

01:11:43
I think that's fine. That's totally fine.

01:11:45
And I don't think that we should ever have to declare what our

01:11:48
high-priority issues are, and why, and defend that.

01:11:51
Like, that makes literally no sense to me.

01:11:54
Well there's also no way back to this email that like for the CEO

01:11:58
and the people that were annoyed, for whatever reason, by having to

01:12:00
say their gender pronouns, it was a high-priority

01:12:03
issue for them. It's like, you're running

01:12:05
MailChimp. Your highest priority issue.

01:12:07
Dude, should be like, why do people use this product that?

01:12:11
Like has the most dumb-looking user interface ever.

01:12:14
Like I feel like I'm in kindergarten when I use it but

01:12:16
like, nobody really likes it. But, you know, it's like, maybe that's your

01:12:20
high priority, that your business is being

01:12:23
stolen by, you know, Substack. I would just... And look,

01:12:27
maybe the CEO had no ability to answer those questions and so

01:12:30
for him, he felt like, well, the battle that I can

01:12:32
wage is the one about gender pronouns. That would be

01:12:35
a great... I would love the case to be, the

01:12:37
guy's like, fuck, like, I have no vision for this

01:12:41
product, so, naturally, let's do this, right?

01:12:44
It's like the Republicans basically decided like our

01:12:46
vision is completely unpopular, so

01:12:49
we fight about some other shit. Like, that's, that's... So if that's

01:12:55
the truth here, that is some, like, 11-dimensional chess. I

01:13:00
would love that. That I would love to see.

01:13:02
Oh, this guy, it was just like, the board comes in, like, all

01:13:06
right, what's the 12-year

01:13:08
plan? And you have to tell people that it's, like, welp...

01:13:11
Basically all we have left are internal culture wars that we

01:13:14
adjudicate. Yeah.

01:13:16
And, you know, hopefully it can bring up some name recognition,

01:13:19
because these emails will leak, and, you know, people remember

01:13:22
Mailchimp, because the last time I thought about them was when they

01:13:24
advertised on Serial. And for those people who love Parler,

01:13:28
they will use Mailchimp as the based emailing platform. And I

01:13:33
don't know... I think there are some multiple

01:13:36
layers to this guy's strategy,

01:13:38
because aside from that it seems thin broth.

01:13:41
Anyway, we went there. All right, this was fun.

01:13:43
All right, fine. Thanks, guys.

01:13:45
Thanks goodbye, goodbye.

01:13:59
Goodbye, goodbye. Goodbye goodbye.

01:14:02
Goodbye.