Emperor Zuck (w/Deepa Seetharaman)
Newcomer Pod · January 05, 2022 · 01:10:06 · 64.19 MB

Deepa Seetharaman is a longtime friend, Wall Street Journal tech reporter, and — most importantly — a committed Dead Cat podcast listener. Her ears have been burning as we've discussed her and her colleagues' reporting with former Facebook Chief Security Officer Alex Stamos, and as we've dissected her reporting on Instagram's influence on teenage girls in our episode "The Facebook Philes." And given that we named this podcast after Mark Zuckerberg's strange text messages with board member Marc Andreessen, we thought it was about time we brought on someone who actually regularly writes about Facebook to talk about the state of the company as it is under siege from whistleblower Frances Haugen and the media.

Katie Benner, Tom Dotan, and I talk to Seetharaman about the Journal's Facebook Files series, Mark Zuckerberg's ever-increasing control over the company he co-founded, and what Seetharaman knows about Zuckerberg's relationship these days with Sheryl Sandberg and Peter Thiel.



Get full access to Newcomer at www.newcomer.co/subscribe

00:00:05
Welcome. Hey, it's Eric Newcomer here with

00:00:15
Dead Cat. Katie's laughing at me.

00:00:17
She's here, Tom's here. We have our friend Deepa,

00:00:21
longtime friend from The Wall Street Journal, who's been

00:00:24
involved in the Facebook Files reporting, has covered Facebook

00:00:29
forever, and has been listening to this

00:00:32
podcast, I think with some irritation. And

00:00:36
the goal of this podcast, in my mind, is to solve the Facebook

00:00:41
issue. I mean, I sort of didn't

00:00:43
want to do it. I didn't want it to be like, sorry,

00:00:44
do you have an issue with us? Like a Marc Maron

00:00:47
podcast, because I thought it was more of a therapy

00:00:50
session myself. We good?

00:00:52
Are we good, Deepa? Are we good?

00:00:54
I mean, I think there's been so much Facebook coverage that.

00:00:57
Yeah, it's a fascinating media story also, which is

00:01:01
why we've talked about it. And there's a certain, like, oh my

00:01:05
God, is this story just going to go on forever, when

00:01:08
at the same time it feels like nothing's changed.

00:01:10
So I want to try and structure this a little, because it does

00:01:14
feel like a lot. And Deepa, you're the expert here. I guess,

00:01:19
to help organize it, I think a good starting point would just

00:01:22
be, like, in your mind, if you had to give us two or three of the

00:01:26
core, like, Facebook open threads.

00:01:28
Like, we have all this reporting; obviously, you had a

00:01:31
bunch of different topics in the Facebook Files.

00:01:33
If you're like, oh, the real storylines about Facebook, if

00:01:37
you had to narrow down, just so we can focus around something

00:01:41
to start off with. What would you say are, like, the

00:01:43
key open threads? I mean, okay.

00:01:46
They're all going to be kind of obvious, because this

00:01:48
company's been covered so aggressively and for so many

00:01:52
years. But, like, I think if I'm thinking

00:01:55
about different storylines for Facebook that I find the most

00:01:59
interesting, the most important one

00:02:01
is, like, the consolidation of power by Mark.

00:02:03
This is a company that has always had Mark Zuckerberg at its

00:02:06
center, right? I mean, like, as a shareholder

00:02:09
he has all the voting power, he's the chairman, he's the CEO,

00:02:11
he's the founder, he has this cultural cachet inside

00:02:14
the company. But lately, like, if you look at all these

00:02:17
different external bodies that are supposed to check the

00:02:20
company, they don't really do much. Like, the oversight board

00:02:24
was likely lied to, right, in the process of its creation.

00:02:30
They haven't done anything. Have they publicly rebuked

00:02:32
Mark. I don't know.

00:02:33
No, right. The actual board at one point

00:02:37
included people who disagreed with Mark and wanted to push

00:02:40
back against him, and then they got replaced last year, right?

00:02:43
And, you know, there's nothing that shareholders can

00:02:47
really do. If you look at all shareholders not named Mark

00:02:50
Zuckerberg, most of them are unhappy with the current

00:02:54
governance structure, but it doesn't matter, because they're

00:02:57
not named Mark Zuckerberg, right?

00:02:58
And so I think that is a thread

00:03:01
I think a lot about. I think about the fact that like

00:03:03
so much power is concentrated in this one person, and that's not

00:03:07
something you see in government, like, necessarily. You don't

00:03:12
necessarily, you don't really see that

00:03:14
in other companies I've covered, where it's, like, literally one

00:03:18
person making all these calls. And the name of this podcast,

00:03:22
obviously, is from, right, an episode of him

00:03:27
consolidating control. Would you relate the rise

00:03:29
of Zuck as emperor of Facebook to that moment that has become,

00:03:34
you know, eponymous with our show that he was able to create

00:03:37
this dual-class structure that allowed him,

00:03:40
even if he was selling off his shares, to maintain absolute voting

00:03:44
control over things. I mean, was that, like, the

00:03:46
villainous turn in his leadership that brought him to

00:03:49
where things are today? I think it's one of the moments,

00:03:51
right? I think there's a lot of

00:03:52
different periods in time where Mark, like collects more power.

00:03:56
I mean, like, I personally, as a reporter, when I

00:03:59
think about it, it's worth going back to the early days when this share

00:04:03
structure was even decided, right?

00:04:05
Like, this is, like, you know, Sean Parker, like, trying to

00:04:09
convince Mark that he needed to make sure he

00:04:12
retained as much power as possible.

00:04:14
Like, I think I'd love to know a little bit

00:04:17
more about those conversations, and I'd love to know a little

00:04:20
bit more, because that to me is the starting moment where

00:04:24
he just starts to wield incredible power.

00:04:27
Like, he always had technical power; like, he always has had

00:04:30
control of the company. But what you've

00:04:35
seen in the last couple of years is a willingness to exercise

00:04:39
that power. So before the 2016 election, you know,

00:04:42
Mark was the majority voting shareholder, right?

00:04:45
But he had Sheryl Sandberg as his partner in crime,

00:04:49
right? And she handled all the government affairs and all the

00:04:52
politics, whereas he was really focused on

00:04:55
the technology and the content. That's completely

00:04:57
changed in the last few years, right?

00:04:59
And each, each one of these is a little sway, and that

00:05:02
conversation, the text conversation, is one of them,

00:05:05
because, you know, it's so convincing.

00:05:07
Marc Andreessen is so in his corner, he's texting him in the

00:05:11
middle of a supposedly independent board meeting, a

00:05:16
committee meeting, about, like, the share structure, right?

00:05:20
I mean, like, it's just incredible.

00:05:24
Yeah, and the infrastructure that was put in

00:05:27
place to keep Mark Zuckerberg specifically enthroned

00:05:33
has been set, and there's sort of nothing we can do about it. We

00:05:38
can dig into all of these, and obviously we love the Mark

00:05:41
Zuckerberg control one, because it's personality and fun.

00:05:44
Yeah. Like, and it's sort of, you know,

00:05:47
value neutral, so reporters can, like, really

00:05:49
wade into that, because it's just sort of, like... anyway.

00:05:52
But I want to go through your sort of big Facebook narratives.

00:05:55
So, what's number two? So that's one.

00:05:57
And then the second, I mean, another, is Facebook and the

00:06:01
continuing kind of erosion of democratic norms, right?

00:06:05
And, look, what is the role? That one's a little, that one's

00:06:08
a little murkier. There's a lot of complex reasons

00:06:10
why democratic norms around the world have changed, and, I mean,

00:06:15
Facebook is a potentially contributing factor in a lot of

00:06:19
those cases. It's, like, what do you say? I

00:06:22
mean, what do we say at this point, right?

00:06:23
So much has been written about that. But the flood of

00:06:28
disinformation, misinformation...

00:06:32
you know, what is the effect that it has on society? What does it have on democracy,

00:06:36
and the way that people pick the facts that they believe in and

00:06:40
what they believe is real, and how they vote?

00:06:42
I mean, I actually think there's that moment.

00:06:44
I was in the room for that Techonomy conference in 2016, and...

00:06:48
you were there? Oh yes, me too.

00:06:51
Hello, wow. Royalty.

00:07:00
I felt flattered, and I, you know, I was, I was, you know, you

00:07:08
know... so you were there when he was...

00:07:09
Like, yeah, it's pretty crazy to think about.

00:07:11
Well, I have no idea what you guys are talking about.

00:07:13
I don't even... So, right: basically, the day

00:07:17
after the election, there was a conference, and Zuck was

00:07:19
scheduled to speak. So, of course, the first

00:07:21
question Zuck is asked is, do you think that Facebook played

00:07:25
any role in Donald Trump's election, in his victory? And

00:07:29
Zuckerberg said... Deepa, do you remember exactly what

00:07:31
he said? Well, I remember the words he

00:07:33
says: pretty crazy.

00:07:35
Pretty crazy to think that Facebook affected the election.

00:07:39
Yes. I mean, you know, I don't

00:07:41
remember... in the room, I remember the room being kind of, like,

00:07:43
silent, and, like, there wasn't some big collective

00:07:47
gasp or anything. But, like, externally, on social media, like,

00:07:50
yeah, it was like a bomb.

00:07:53
It was the day after the election.

00:07:54
Everyone was pretty numb, for the most part, in the room.

00:07:57
Nobody had any emotional reaction to anything. The only

00:08:01
thing I remember, too, in terms of picking up pieces of words that

00:08:05
he said during that: I think he said that people make their

00:08:07
election voting decisions based on lived experiences rather

00:08:12
than, you know, the information that they consume online.

00:08:15
I think that was the point, right?

00:08:17
It would take him, like, it would take him, what, seven years later

00:08:20
to acknowledge that Facebook is a lived experience,

00:08:23
when he renamed his company Meta; like, nothing's offline at all anymore.

00:08:27
Like, that's a good point. Yeah, yeah.

00:08:30
And, I mean, I actually, in some ways, I think there is, and

00:08:33
continues to be, like, a debate about the extent to which

00:08:36
Facebook's responsible for that, 100%.

00:08:40
I don't think it's worth saying that that was that insane of

00:08:43
a statement to make. I mean, like, we can laugh about

00:08:45
it now. As we can. I don't wanna

00:08:47
interrupt you. But, yeah, I don't want

00:08:49
to say... okay, I don't want to posit that as, like, a completely

00:08:51
crazy thing that he said. I think, this many years later,

00:08:55
there's reasonable debate to be had

00:08:56
Yeah, on whether or not that was right. Now,

00:08:59
the discussion with Alex... we'll come back to that.

00:09:02
Yeah, yeah. I remember the discussion with

00:09:04
Alex. Well, then, and then, you know,

00:09:07
I think there's also another thread, which is just Facebook and,

00:09:11
like, harms to, like, mental health; like, the way that it

00:09:19
sort of... I mean, like, the teen health story, which I know we're

00:09:23
going to talk about. There's also the problematic use slash

00:09:26
tech addiction storyline, right? Like, where people are

00:09:29
compulsively using Facebook, to

00:09:33
the point where they don't spend time with their kids. There's a

00:09:36
subset of that, some subset of those

00:09:38
users. That's something that came out

00:09:39
of the Facebook Files documents. And so there's just a lot of,

00:09:43
how does Facebook personally harm... to get to Tom's point,

00:09:46
like, how does it personally harm the individual,

00:09:48
the you? Yeah, you.

00:09:50
Yeah. And so I think there's

00:09:52
that, there's that. There are so many other themes, but those are

00:09:55
the ones, those are the ones. And some might argue

00:09:59
that themes... Yeah.

00:10:01
Two and three are related as well.

00:10:04
Yes, they are related. All right, now away we go.

00:10:07
Like, this is helpful. This is a well-organized

00:10:11
review of Facebook. Yeah, it's almost as if Deepa's

00:10:14
been thinking about this for a long time. A long time. I mean, there

00:10:19
is a lot, by the way; there's a lot beyond just, like, three.

00:10:22
Can I ask you, on these, on these points: as you are thinking about

00:10:26
them and doing your reporting, how deeply do

00:10:29
a lot of your sources at Facebook contemplate

00:10:32
these things? There's such a broad range,

00:10:34
right. There are definitely people who

00:10:36
think that this is all bullshit. But, okay, yeah, maybe Facebook

00:10:40
had a role in Myanmar, okay, fine.

00:10:42
Yeah, maybe, you know, disinformation

00:10:44
is bad, but we don't necessarily know how disinformation, like,

00:10:50
affects votes. I mean, okay, January 6, you

00:10:54
know... Like, there's definitely people who are a little bit in

00:10:55
denial, and that's, like, kind of

00:10:58
their train of thought. And then there's definitely

00:11:00
people for whom it's hard to engage in this conversation

00:11:03
because they feel like the media is out to get them,

00:11:05
all right? There's a whole media versus

00:11:07
Facebook theme too, which I don't think is particularly

00:11:10
helpful, but it is something that a lot of people think.

00:11:13
You know, there's a feeling of, like, oh, well, the

00:11:15
media's... I've actually had Facebook

00:11:17
people argue to me before, like, the media is jealous of us

00:11:24
because we make so much money and we're taking all their

00:11:27
influence. And, you know, there's a grain of truth to it, but

00:11:33
it's very complicated. I don't want to...

00:11:35
Yeah. I mean, reporters, by their very

00:11:37
profession, believe in, like, a reported, intentional way of

00:11:41
knowing things about the world. They believe in, like, truth.

00:11:45
They believe in, like, fact-checking, and they believe that

00:11:48
people who do those things are virtuous and in that way should

00:11:52
get rewarded in some sense for it. And Facebook loves to reward

00:11:56
people. I mean, Ryan Broderick at Garbage

00:11:59
Day was just writing about, you know, these viral, like, cat

00:12:02
breastfeeding posts, you know. Like, social media, whether it's

00:12:05
TikTok or Facebook, sort of thrives on pumping out things

00:12:08
that are, like, obviously false. And I think any of us as

00:12:11
reporters, it can drive us crazy, because we believe in, like, truth,

00:12:17
and we think it's bad that these platforms, just in and of themselves,

00:12:20
feed people stuff that's obviously not true

00:12:23
just to make them angry for a second, to get engagement.

00:12:26
And so, is that a bias? I don't know,

00:12:29
that we all share. That's literally never been a

00:12:32
motivating factor in

00:12:34
anything I've thought about reporting. We share, like, a worldview.

00:12:37
You don't agree with that? No, I don't. Like, I don't, I

00:12:41
don't think that's, like, something that motivates me to

00:12:42
report at all, 100%. No, the couple times I've written about

00:12:46
Facebook, that was literally not something

00:12:48
I thought about. What did I think about?

00:12:50
Here's a list of people I need to call.

00:12:52
Here's a list of things I need to know.

00:12:54
I have X amount of hours to get the information, and I'm going

00:12:58
to get it. And I'm going to write this file,

00:13:01
and someone else is going to clean it up, and I hit publish.

00:13:04
That's what I thought about. Well, it wasn't...

00:13:06
it's easier for, like, the Mark Zuckerberg sort of

00:13:09
style bucket one, but, like, buckets two and three have a lot

00:13:13
of, like, what are your values. Do your values play into it,

00:13:17
Deepa? I mean, if you want them to. I

00:13:20
mean, but, like, Deepa, let's, like, drill in

00:13:22
specifically on the "you're a member of the media and you're

00:13:25
only writing critically about us because you're jealous of our ad

00:13:29
money" or something. I mean, and you've actually

00:13:31
had these conversations, because I never have. You see it play

00:13:34
out on Twitter, which is the world's worst medium to have a

00:13:37
meaningful and nuanced conversation with someone.

00:13:39
But, yeah, as you are engaging, you know, in a background

00:13:43
setting, so you can speak sort of freely with someone, and they

00:13:45
put to you this, like, you know, canard about what the

00:13:50
media is, are you able to have, like, a reasonable conversation

00:13:53
with them about it? And, you know, maybe your mind is

00:13:56
changed slightly by them, and vice versa?

00:13:59
It kind of gets down to the function of news, though.

00:14:02
Like, I went to journalism school, and I had this

00:14:05
professor in college who talked about how, during the Vietnam

00:14:08
War, he felt like it was newspapers' responsibility to

00:14:11
consistently shove in people's faces

00:14:14
the pictures of coffins coming home, you know, like, repeatedly

00:14:17
show it to them. Like, I think he used the word

00:14:19
"force": force it down their throats. Not because they like it.

00:14:22
Like, news is one of those weird industries where it's not just, you

00:14:26
don't give people what they want, or what they say

00:14:28
they want; you give people what

00:14:31
you, as an institution, think they need. And if we as an

00:14:34
institution think, hey, you need to understand and

00:14:37
highlight the harms of technology,

00:14:40
then, I mean, yeah, I guess that's an editorial

00:14:44
choice. But, like, you know, that's the

00:14:46
purpose of news: to highlight issues so that people can kind

00:14:50
of make their choices and understand. Because they're

00:14:53
going to get PR from everywhere, right?

00:14:55
There's advertising. Like, you know, if you

00:14:59
think of, like, a Facebook user, you will get a lot of

00:15:02
information about Facebook from the news.

00:15:03
But Facebook has all these other routes, right? There is the actual

00:15:06
Facebook product, right, that is used by millions of people.

00:15:10
Lots of them. I know, whenever

00:15:12
Sheryl Sandberg has some live stream,

00:15:15
I get some notification and there

00:15:16
I am, suddenly listening to her directly. It's extreme.

00:15:20
It's a huge microphone. That's my favorite show, every

00:15:22
time. And, you know, so they have their own platform, they

00:15:26
have advertising, they have, you know, there's still a lot of

00:15:29
places that will, like, do interviews

00:15:31
with them that are softball-y.

00:15:32
Like, I don't, you know, I don't know. It's not my job.

00:15:35
My job isn't to be PR. It's not your job to be like,

00:15:38
this is the number one movie in America, you know what I mean?

00:15:40
So, mainstream media, Where The Red Pill.

00:15:46
All right. So the sort of Alex Stamos,

00:15:49
sort of, democracy fight. I mean, this fits into sort of

00:15:52
the media-has-an-agenda thing, because I think a key framing there for

00:15:57
him was this idea that Facebook was guilty

00:16:03
in that they, you know, let the Russians, like, you know, buy

00:16:09
some ads that sort of manipulated people, but that

00:16:12
campaign was much smaller than the effect of the actual hack,

00:16:17
which was sort of laundered through the media, and that the

00:16:20
media was far more complicit, and that the media has an interest

00:16:24
in blowing one of those two scandals out of proportion.

00:16:29
And that's, that's sort of where we are.

00:16:31
Do you disagree with, with that argument?

00:16:34
It's, like, two different sets of media, though, right?

00:16:36
Like, I mean, tech reporters... the media is not, like, a blob

00:16:40
that, like, a blob that really moves in unison, right?

00:16:45
And, look, we're not

00:16:48
organized enough to be, like, a conspiracy. Like, we don't have,

00:16:53
like... look, I don't know what my colleague who sits next

00:16:57
to me is doing, maybe, like, more

00:17:01
than half the time, right? More than half the time.

00:17:03
And, like, not that I have any colleagues who sit next to me

00:17:06
anymore, but you know what I

00:17:07
mean? Like, I'm with my kids right now, and I also don't know

00:17:09
what my kids are doing. I relate.

00:17:12
And so I just think that, you know, tech reporters were

00:17:17
focusing on Facebook and sort of the impact of the Russia

00:17:23
campaign, and there isn't necessarily a... like, what's the...

00:17:28
like, those same reporters weren't involved in a lot of the

00:17:32
Hillary Clinton, like, kind of the WikiLeaks coverage that Alex is

00:17:36
talking about, right? Like, this is where I am way

00:17:38
out of my depth, but it's not the same individual

00:17:42
reporters. So who should answer

00:17:45
that question? It's so hard for me to

00:17:47
answer. But also, the other thing that Alex sort of does: if

00:17:51
you end the story at the 2016 election, I think

00:17:55
his point is very strong. But what he then goes on to say, and

00:17:58
he does not link these points, is that disinformation,

00:18:01
misinformation, bad information,

00:18:04
democracy-eroding information is now spread by everyday Americans.

00:18:07
We actually don't need Russia to hack.

00:18:10
John Podesta's email and grab it and give it to reporters. Now,

00:18:16
everyday Americans are taking care of that, but the tool they

00:18:19
are using is still Facebook. And so, like I said, if

00:18:24
you stop his narrative at 2016, I think it is very strong, but

00:18:28
what he doesn't do is extend the timeline to show that

00:18:32
you no longer need Russians to hack anyone, because Americans

00:18:35
are taking care of it themselves, but the tool that

00:18:37
they need in order to spread that information is Facebook.

00:18:41
You know, if I can just say one thing that is probably, like, just

00:18:46
sort of a side point: Alex's broader point about

00:18:48
media introspection and, like, how reporters need to really think

00:18:53
about the editorial choices they made...

00:18:55
to me, it's separate from Facebook's

00:18:58
responsibility. But it's also, I mean, like, true. I agree, like,

00:19:01
reporters need to be constantly thinking about it when it

00:19:04
comes to these issues, and I don't think we as an industry do

00:19:08
enough to interrogate ourselves and look at our choices,

00:19:12
especially our editorial choices, right?

00:19:14
A lot of the criticism is at the masthead level rather than the beat

00:19:17
reporter level. Yeah, and we're talking to the

00:19:19
beat reporter, obviously, right. But I also kind of think those

00:19:22
are separate questions from the impact of technology, and

00:19:27
whenever I talk to somebody who's very defensive about

00:19:29
technology, they always try to, like, kind

00:19:31
of throw it to this other argument, which is the media.

00:19:33
And I'm like, these aren't the same topic. Well, but

00:19:37
do they agree with that? Because the media has made it...

00:19:39
I mean, like, you know, this is the argument that tech people

00:19:42
will always make: like, you can poll the audience, but the

00:19:45
audience is just a reflection of the media that they consume

00:19:47
which we've now. Agreed is the case, one way or

00:19:50
the other, then you know they can definitely say an agenda is

00:19:53
responsible for that sentiment. Well, I mean, if it is I don't

00:19:57
know. If you look at just what could

00:19:58
like, just look at consumers. They still use Facebook

00:20:01
constantly. Look, I've been writing about

00:20:02
this company for almost seven years.

00:20:03
I use Facebook. I sold my couch recently on

00:20:06
Facebook Marketplace. I do best program.

00:20:11
I'm not anti-Facebook, right? Like, these things aren't...

00:20:15
it's not mutually exclusive to use

00:20:20
Facebook products but also be critical of the company's harms.

00:20:23
It's like, I don't know. We all vote.

00:20:25
We live in a democracy but we can still challenge our

00:20:27
democratic leaders. The effect of all this reporting

00:20:31
on Facebook is that there are people who are,

00:20:34
like, shut Facebook down. Or it does, it does feel like, you

00:20:37
know, some of it can feel like it's so relentless, because the

00:20:41
goal is to get rid of Mark Zuckerberg, or because it's like,

00:20:45
well, he's responsible. There's been no change.

00:20:47
How can you believe, you know, that

00:20:50
the company really means to change when the guy in charge...

00:20:53
Yeah, and I know you can't say one way or the other whether you

00:20:56
think Mark Zuckerberg should be replaced.

00:20:59
Yes. But, I don't know, what can you say about the

00:21:02
relationship between the fact that it's one guy and sort of

00:21:06
the relentlessness? I mean, I

00:21:12
just think he's accrued a lot of power, and I think he's

00:21:16
probably more powerful than he appears to be, at least from

00:21:19
the outside, right? There's a lot we don't

00:21:21
necessarily know, but he appears from the outside to be in, you

00:21:26
know, as powerful a situation as at

00:21:29
any time before, right? I don't see him stepping down.

00:21:33
Yeah, he literally floats above water, right?

00:21:38
He's, he has... he's accrued godlike abilities.

00:21:42
He's using a hydrofoil. A hydrofoil, New York Times.

00:21:46
It was not, it was not a windsurf, it

00:21:49
was a hydrofoil, right? One of the worst corrections

00:21:53
I've seen in the Times' history. And so there's a real

00:21:58
question of, okay, so what's been the impact of all

00:22:03
this reporting, this fairly good investigative reporting, on

00:22:09
Facebook? And I think there's a lot of

00:22:11
people within... one thing that I find really interesting:

00:22:14
so you guys remember Cambridge Analytica,

00:22:17
that whole episode, unfortunately.

00:22:19
Yes. And so Cambridge Analytica

00:22:22
was interesting for a lot of reasons.

00:22:25
I don't... there is that narrative that, you know, the data was

00:22:29
taken by... let me see if I can remember it.

00:22:32
I haven't thought about this in a long time. Kogan, and then...

00:22:38
Aleksandr Kogan, right? That's his name, I think.

00:22:40
And then it was given to Cambridge Analytica, and then

00:22:43
Cambridge Analytica supposedly fed that into their,

00:22:48
like, kind of their machine learning software, whatever, and

00:22:52
then were able to target would-be Trump voters, or, like, voters

00:22:57
in general, in the election, and flip their votes: either got

00:23:01
them to vote for Trump or got them, right, to not vote for

00:23:04
Hillary. And I think that feels super

00:23:07
hard to prove. Like, I don't... it's a fake

00:23:09
scandal written in The New York Times that nobody believes anymore,

00:23:13
that they've never really apologized for.

00:23:15
I mean, that was also The Guardian.

00:23:17
Eric, they're absolutely right.

00:23:19
And then, when there was a Pulitzer campaign, there was a

00:23:22
lot of drama around which side should get credit there. But I

00:23:26
do feel like the Times is the only publication

00:23:29
remembered for writing that story. I mean, that's what I read.

00:23:33
I mean, it's very influential, no?

00:23:34
But that, that is a good point. That is a good point.

00:23:36
I mean, no, don't get angry. Like, why is the Times the only

00:23:41
publication remembered for writing that story?

00:23:43
Because it wasn't just the Times' story, it was The Guardian's.

00:23:46
It was The Guardian... I believe they won awards for it.

00:23:51
They broke the story, right? And, like, every other publication

00:23:53
matched it. So why is the Times the only one

00:23:55
at the forefront, you know? First of all, they did have, you

00:23:58
know, an ultra, ultra long story. No, we had to, we had to team up

00:24:00
with The Guardian.

00:24:03
lines on all of the stories. I mean this is sort of similar

00:24:07
wiser - coverage of Facebook. It's like because New York Times

00:24:10
is the most powerful you know, like news Alan people hold it in

00:24:13
highest. I'm not, I'm not saying I'm not,

00:24:16
I'm not saying you shouldn't criticize.

00:24:17
I'm just wondering like, why you only want the times to

00:24:20
apologize, I trust the times 10x to the guardian so I don't care.

00:24:24
Really. If the guardian even though

00:24:26
Carol called baller is now credited with actually breaking

00:24:29
the story. Archie has a very strong.

00:24:31
She's a very opinionated report anyway, totally different.

00:24:34
Yeah, we're but we also trusted Journal equally to the time I

00:24:37
do. Trust the journal.

00:24:40
Yeah. Well what I was going to say

00:24:41
Yeah. Well, what I was going to say is

00:24:44
that that part was, like, kind of hard to prove. But what it

00:24:47
absolutely demonstrated was that there was a time when you could

00:24:51
just, like, kind of suck up user data from Facebook, and then the

00:24:54
company had absolutely no idea where it went, right?

00:24:56
And that is a legitimate issue, and then they addressed it.

00:25:00
And, you know, it was like, Mark Zuckerberg took five days to

00:25:03
sort of sit down and, like, understand what was going on,

00:25:05
come up with a bunch of seeming fixes or whatever, and

00:25:09
then finally spoke up. He was criticized for waiting that

00:25:11
long. I mean, there are certain, certain

00:25:13
things in hindsight... I mean, like, the five-day

00:25:17
wait: how big of a scandal is it? I mean, it's a little, in the,

00:25:22
like, kind of... as we go further past it.

00:25:26
I don't... front-running was the biggest scandal in

00:25:29
that whole thing. That was... what do you mean?

00:25:31
Remember, the Facebook comms front-ran the actual issue,

00:25:38
because that made them look guilty, and then everybody was

00:25:40
like, oh, fuck Facebook. There's nothing reporters hate more

00:25:43
than front-running. Like, yeah, but the five

00:25:47
days: they took a while to actually, like, figure it out and

00:25:49
come up with a substantive response to, like, the kind of

00:25:52
contract... Because I remember, I actually

00:25:54
remember this very well because I was in the process of planning

00:25:56
my wedding, during the Cambridge analytic has gone away.

00:25:59
It was a beautiful wedding, thank you.

00:26:01
It was also really hard to plan at a certain point.

00:26:05
I just sort of, like, stepped back. I was like, okay, I can't

00:26:07
plan. What, you did two religions, multiple days at a retreat?

00:26:12
It was. Wow, that's nice. Eric was actually there.

00:26:15
While I was getting ready, when it published. That is your

00:26:25
fault. Anyway, so we just,

00:26:29
you know, the actual thing that I think is really

00:26:33
important is the leak of data.

00:26:36
And so anyway, if you remember the Facebook response to that

00:26:39
was, okay, we're really sorry, these are all the things

00:26:42
we're doing around privacy, these are all the different

00:26:45
steps we're going to take, etcetera, etc.

00:26:47
Etc. You haven't seen that this time,

00:26:50
right? With the, with the documents

00:26:52
that right? That were published in the

00:26:55
journal and then later by a lot of other outlets, like what the

00:26:57
whistleblower got. Like, you just haven't seen

00:27:00
that level of apology. In fact, what you see is defiance.

00:27:03
And I think that's a reflection of Mark gaining more power,

00:27:09
just feeling more confident, and sort of asserting what he thinks

00:27:12
is, right? And like our understanding is,

00:27:14
it is and it's been reported elsewhere.

00:27:16
is that a lot of that is being driven by Mark, with him sort

00:27:19
of saying, I don't want apologies.

00:27:21
I want us to be tough, and that happens when they get into

00:27:25
arguments. Like, I feel like they're doing that here.

00:27:27
Like, I do think. So, is it Facebook Files,

00:27:31
like, champagne? They're only Facebook Files

00:27:34
if it's published in the Wall Street Journal; anywhere

00:27:36
else, it's sparkling wine.

00:27:40
Yeah, right, this is what we called our series.

00:27:42
And then, I guess, when the consortium got it,

00:27:45
they were like, we don't want to call it the Facebook Files, but

00:27:48
I forget which reporter said

00:27:50
that they didn't want to call it that. I forget, what did they call it?

00:27:52
They called it like the Facebook because Casey, the left over,

00:27:54
right? So they went with, like, the

00:27:57
Facebook Papers. So, like, yeah, you know, some

00:28:00
people say Facebook Papers, some say Facebook Files, but this is evidence

00:28:03
here that the media does not collude very well.

00:28:05
You know, anyway, like, yeah, so, you know, the Facebook

00:28:10
Files story. Tell us about your Instagram story.

00:28:13
We've talked about it on the podcast somewhat, maybe in ways

00:28:17
you disagree with so I don't care.

00:28:19
We've always been incredibly positive about all the Journal's

00:28:23
Facebook stories. I mean, a lot of, like, all

00:28:25
credit kind of goes to my colleague.

00:28:27
Jeff, for basically convincing Frances Haugen to

00:28:29
come forward and provide all these documents, and it's been

00:28:32
awesome to work with him. And it's actually funny,

00:28:34
I was on maternity leave earlier this year, and he's like,

00:28:40
hey, can I come over? I just want to talk to you about

00:28:42
this, like, project we're working on. And I had a three-month-old

00:28:47
daughter at the time and, like, listen, three-month-olds, you know,

00:28:49
Tom knows, babies at that age, they're a lot. And so he's

00:28:54
sitting in like he's sitting there.

00:28:55
It's, like, post-vaccine times. He's like, look, I really

00:28:58
want your help on this project. I'm like, okay, fine, yeah, what is

00:29:02
it? He's like, we have a bunch of internal documents and they're

00:29:04
really revelatory. And I was like, okay. I wasn't

00:29:08
processing it at all, and then my daughter vomited on me, and then,

00:29:12
like, all these things happened, and then, you know, it wasn't

00:29:16
like the moment where things clicked, like, wait a

00:29:18
second. There's something that is putrid

00:29:20
and gross that lives inside an organism that when it gets out

00:29:23
says something about the world and what it means to me, if I

00:29:27
had the mental capacity to think like that? Maybe, but I had

00:29:30
nothing. I was so fried. And, you know, I

00:29:32
just I didn't really appreciate what he was talking about until

00:29:36
like I came back like a few weeks later and started going

00:29:41
through the documents and I was like, this is amazing.

00:29:43
Like it was just fascinating, right?

00:29:45
And in part because you know, you asked this question that I

00:29:47
didn't really fully answer earlier about, like, what's the

00:29:49
conversation internally at the company?

00:29:52
Like, all right, and they're definitely people who are

00:29:53
defensive and feel like the media is out to get them.

00:29:56
There are also so many people there that are really

00:30:00
thoughtful, conscientious, and, like, trying in earnest, right?

00:30:05
There's a lot of earnestness in there, where they believe that

00:30:09
they can effect change which, you know, we can debate whether

00:30:13
that's ever possible at a company like that, but they

00:30:15
believe that they have a duty to do that.

00:30:18
And you see that in a lot of these.

00:30:20
You see that on a lot of these comments and you see this in

00:30:22
what, you know, she captured. And I think that's the thing

00:30:26
that's meaningful. You know, it's not a monolith,

00:30:28
truly. The media is not a bloc.

00:30:30
Facebook's not a bloc either, there's a lot of, right?

00:30:32
I mean, there's an argument, the way you're saying

00:30:34
it, that part of what the Facebook Files are is employees inside

00:30:38
the company, sort of doing research aligned with the

00:30:42
reporters outside the company. I mean, I laugh sometimes when

00:30:45
we talk about companies. Like, you know, we talk.

00:30:49
We talk about Facebook a lot and you write about Facebook but

00:30:52
like, who is Facebook? Because sometimes Facebook is

00:30:56
like comms people, right? Like, a lot of times that's sort of the

00:30:59
ultimate, because we sometimes talk about Facebook as being

00:31:02
sort of separate from Mark Zuckerberg even though he's

00:31:05
like, very in control and then, there are obviously people who

00:31:09
are doing the research that you're sort of writing about

00:31:12
positively and they're saying, that's not Facebook, so it can

00:31:16
be very confusing. Yeah.

00:31:18
when talking about a company, to say that Facebook's like

00:31:21
one thing when it's, yeah, yeah, I mean, like, that's

00:31:26
true, and we should just be a little bit

00:31:28
more articulate about what we're talking

00:31:30
about when we say Facebook right?

00:31:32
Like, is that leadership? Is it Zuckerberg, Sheryl,

00:31:35
you know, all these different executives that are at the top of the

00:31:37
company? Are you talking about, is this

00:31:39
like a message coming from the Integrity folks, which is where

00:31:42
a lot of this research is being conducted?

00:31:44
You've got to think through the

00:31:45
differences of Instagram and WhatsApp and Facebook.

00:31:49
They all have different and discrete cultures.

00:31:50
I mean, it's obviously, I mean, like, this is very

00:31:54
obvious, I think, but it's not all a single blob,

00:31:59
right? Like the way all companies and

00:32:01
the way the media is, it's just a really complex place, and

00:32:05
I think one thing I've enjoyed and been proud about

00:32:08
with our coverage is the fact that I think we've highlighted

00:32:11
that, because we're talking about these sort of divisions within

00:32:14
the company and in terms of how to approach certain issues and

00:32:20
how the Integrity teams that were built and tasked with,

00:32:24
protecting the company, and protecting users, are often

00:32:29
sidelined. You know, so, in

00:32:31
terms of impact, I'm not sure if, like, the company

00:32:35
is restricting access to that kind of research now, they are

00:32:39
really cutting down on leaks. Someone was telling me that now

00:32:43
data scientists are a lot more careful about how they word

00:32:46
their conclusions. Like, they don't want to,

00:32:48
because, like, they have to think, okay, this is going to be

00:32:50
leaked. So, how do I say it?

00:32:53
And I mean, we haven't heard them pulling research, right?

00:32:57
Like, it's not like they're disbanding

00:32:59
any team that studies addiction or whatever, but we don't know

00:33:01
what else might be going on. I mean, this is the central question

00:33:05
of the Facebook Files, you know: is Facebook just like

00:33:09
a tobacco company? Or like, are we going to look at

00:33:12
that? I mean, do you think the Tobacco

00:33:14
company metaphor? Because it's, like, very oriented

00:33:17
around, what your internal research says?

00:33:20
Yeah and how that can come back to bite you sort of as negative

00:33:25
opinion sort of evolves over time. I think it's not... okay, so I

00:33:29
don't think it was, like, a similar coverage strategy, right?

00:33:34
And in part, because strategy is

00:33:37
almost too fancy of a word. We didn't know that

00:33:40
this data, this research, existed before. Like, we had

00:33:44
all heard of a lot of this research, but, like, nobody

00:33:48
had seen it, right? And so that parallel is

00:33:53
unplanned, I guess, but so, yeah, we're looking at the

00:33:57
company's own assessment of itself.

00:33:59
You saw this with, like, the Exxon coverage

00:34:02
I think a few years ago, and how their scientists knew a

00:34:07
lot about the company's effect on climate change, and, you know,

00:34:09
like, there's a lot of this framework, and this

00:34:13
strategy of coverage is common in a lot of industries.

00:34:17
Right, the stories were good because it was about what

00:34:20
Facebook knew about their own behavior.

00:34:23
Not just what's happening? Yeah, and also what they, you

00:34:27
know, like, that's one thing with the teen

00:34:29
health and mental health stuff. You know, there is an argument

00:34:31
out there that this stuff, this research, these five

00:34:34
research reports, some of which included surveys of tens of

00:34:37
thousands of people, that they weren't conclusive. And fine,

00:34:40
fair. But then, you know, the company

00:34:43
then you look at the choices the leadership made.

00:34:46
Like, did they then go do the more intense research? Did they

00:34:49
then go like, partner with any of these academics that have

00:34:54
been studying this stuff and try to get their opinions?

00:34:56
All right, we're talking around the Instagram

00:34:58
story at this point. Frame it.

00:34:59
Yeah, and, like, let's get into the meat of it.

00:35:01
Alright. So that was the story.

00:35:03
I was involved in the initial run.

00:35:05
You know, we've done a lot of stories since then, but in the

00:35:07
initial run, the Instagram teen health stories. The

00:35:11
general conceit of the story is that Facebook had internal

00:35:16
research that Instagram had undermined the mental health of

00:35:22
a subset of teens, right? So you had, not all teens, but,

00:35:26
like, a lot of teens. If you were experiencing body image

00:35:32
issues, for one in three teens Instagram made it worse. If

00:35:36
you were experiencing kind of suicidal ideation, if you were,

00:35:41
if you were in that mindset for a small percentage of those

00:35:45
teens in the UK and in the US, Instagram made it worse, right?

00:35:50
And so you look at the study, and there's a,

00:35:54
there's a sizable percentage of teens that are, it's

00:35:58
not neutral, it's, like, actually harmed by

00:36:00
what they see on our platform, through the actual feed

00:36:05
and through the Explore page.

00:36:07
And you know, the story then also goes into what leadership

00:36:13
actually did about it. You know, and they, there was a

00:36:19
one particular feature that they decided to launch.

00:36:22
It was like, an opt-in feature to remove likes, right?

00:36:27
From your feed.

00:36:30
And, you know, a lot of the researchers said, well, that

00:36:34
isn't really going to solve the problem and Instagram went

00:36:37
forward with it anyway, because and this is also from their

00:36:39
internal documents, they thought it would be a PR win.

00:36:42
So it raises some questions. Sorry, why would rolling

00:36:45
out, like, an optional feature to

00:36:47
remove your like counts, like, removing the likes,

00:36:51
sorry, how would it hurt people?

00:36:53
No, so, I'm not being clear.

00:36:56
I think one researcher said that removing the likes would

00:36:58
not do anything.

00:37:00
And so, why bother? And Facebook said, well, the reason to bother

00:37:02
is because it would be a PR win. And, like, the idea being that, you

00:37:06
know, the hypothesis was like, if you remove likes and you

00:37:09
remove that social comparison issue, right?

00:37:11
Where you're like, comparing yourself against like somebody

00:37:14
else in your class and then if they get more likes that might

00:37:17
make you feel bad. And you know, so maybe one

00:37:20
solution was to remove the likes, but they found that that doesn't

00:37:22
really help, especially if it's optional, right?

00:37:25
But Facebook went ahead with it anyway because they thought it

00:37:28
would look good. And so there's a

00:37:31
lot in there about not just, like, the problem itself,

00:37:36
But then the way the company went about, like, quote-unquote,

00:37:42
solving it. And like the way they just sort

00:37:44
of, like, sidestepped the issue and tried to put

00:37:47
Band-Aids on some of these problems. And I

00:37:51
think, actually, now I'm curious, like, what did you guys

00:37:55
think? Like, so that's the story

00:37:57
that's actually generated a lot of heat.

00:37:59
Well, and, yeah, I mean, like, the critics, they all say, the

00:38:02
critics point to the fact that the research wasn't as dire as

00:38:06
the headlines might have made it seem, and the way it's sold throughout

00:38:09
the entire story. And I think that what has been

00:38:12
lost, and you let us know why you think this has

00:38:15
happened. What has been lost in the

00:38:18
reading of these stories is this focus on what the

00:38:22
company does with information rather than whether or not.

00:38:24
the information is objectively good or bad, whether it's

00:38:26
objectively as harmful as it could be.

00:38:29
You're not quite fitting into that.

00:38:31
I think we've said, and maybe Katie said it, I forget who, but

00:38:35
does Facebook need to be clearly worse than Vogue for this story

00:38:39
to have impact? And I do think how the headlines

00:38:42
framed it, it's like, oh, some percentage of women feel bad

00:38:45
when they see Instagram or teenage girls, excuse me, you

00:38:48
know, is that true of Vogue? And if it's not worse than

00:38:52
Vogue, like, why is that the headline? Or, not

00:38:55
even just worse than Vogue, but if the metaphor is going to hold,

00:38:59
of it being a tobacco company, in which there is clear and

00:39:02
unarguable medical evidence that smoking dramatically

00:39:06
increases your likelihood of getting lung cancer and other kinds

00:39:08
of negative health effects. That's not the same thing as

00:39:11
like, a subset of, you know, people. All smokers have a

00:39:14
higher likelihood of getting cancer. And so does that

00:39:18
kind of hurt, do you think, the strength of the metaphor, and,

00:39:22
you know, contribute to an idea of overhyping the actual

00:39:26
findings of this report? Yeah, I

00:39:29
don't know, you know, it's interesting.

00:39:31
I just pulled up the story because I wanted to make sure I

00:39:34
was remembering this, right? And like I've gone back and

00:39:37
reread the story because I feel like when it's been quoted back

00:39:40
to me, sometimes I don't recognize the story at all.

00:39:43
And so the story says, you know, 32 percent of teen girls, this is

00:39:47
actually quoting from the slide, right?

00:39:49
32 percent of teen girls said that when they felt bad about

00:39:52
their bodies, Instagram made them feel worse.

00:39:55
This has been quoted back to me as one in three teen girls being destroyed

00:40:00
by Instagram, right? And that's not the same thing,

00:40:03
and, you know, I sort of have to step in and sort

00:40:07
of, you know, I do what I can, you know, I'm like, oh, well

00:40:10
actually, that had to do with body image issues.

00:40:12
And I think we were very responsible and careful in how

00:40:15
we framed it. We drew it directly from their

00:40:18
slide presentations. What do you think about the

00:40:20
accusation that you guys selectively pulled information

00:40:24
that proved that point, but left out slides that actually showed

00:40:26
there were beneficial aspects of Instagram, people feeling

00:40:29
better. That was something Alex,

00:40:30
I think, brought up, and I know others have as well.

00:40:33
I mean what's your response to that?

00:40:34
I want to be a little careful about how I

00:40:40
say this because I want the story to stand for itself and I

00:40:43
think we did a good job of expressing the individual data

00:40:47
points. But there is a distinction

00:40:48
between what we wrote in our story and then how it gets

00:40:50
interpreted by other news outlets, commentators, advocates,

00:40:55
activists, that kind of thing.

00:40:57
And so there are people that have taken that stat, the 32%

00:41:01
stat, and, you know, they don't couch it or, you know,

00:41:05
describe it the way it's written in both our story and the slides

00:41:09
right? So it's a little hard for me,

00:41:11
too. I don't know.

00:41:13
You guys have been in the situation.

00:41:15
I'm sure a number of times, like, what happens when someone

00:41:18
is saying? Hey, this version of the story

00:41:21
that's been circulating is wrong, okay?

00:41:25
A thing we've all experienced as

00:41:27
reporters is some subject hates the story and they're like, this

00:41:30
is false and you're like, well, I've been through it with

00:41:33
comms, every line of it is factually true.

00:41:36
Like, you guys aren't contesting anything any individual

00:41:40
person, comms, or a lawyer would think is false.

00:41:44
You know, like isn't that the definition of what truth and

00:41:46
falsity is? Like, all the claims are true, and then the

00:41:51
subjects and the public are like, well, it reads false to me

00:41:56
and I do think we sometimes... You know, for a lot of my career,

00:41:59
I would get lost in the journalistic side of it, but I

00:42:02
do think there's a valid point that if, like,

00:42:05
people feel like your article is leading them somewhere, right?

00:42:09
given other facts they wouldn't believe, then

00:42:12
there's like a falsity even if obviously all the moves are fair

00:42:17
and, as a reporter, obviously I'm sympathetic, it's terrible to

00:42:19
have something called false when it's truthfully

00:42:22
constructed and fair and the journal in particular is very

00:42:25
judicious in a way that other outlets aren't.

00:42:28
I mean, you read back the Theranos story, which we've

00:42:30
been talking about. Obviously, it really speaks, you

00:42:33
know, at the time I'm sure I would have criticized it for

00:42:36
pulling its punches. I hated it when Bloomberg would

00:42:39
do this, but there is something to be said for the restraint in

00:42:43
positioning stuff, which maybe played into, you know,

00:42:46
Frances Haugen's reason to leak specifically to the Journal.

00:42:48
I think she's mentioned that, that she thought that Jeff was a

00:42:51
pretty even-handed operator, but I want to go back for a second

00:42:54
on the Facebook front, because I think why Facebook

00:42:58
is such an interesting company on this topic is that

00:43:01
there is so much that people personally feel about the

00:43:04
company at this point that it almost doesn't matter how nuanced

00:43:07
and even-handed your story is because once certain conclusions

00:43:10
that are in there, get interpreted by the people and

00:43:13
get mixed in with their preconceived, notions about

00:43:16
things. It's just, it's out of our

00:43:18
control as the media. And it really struck me when I was

00:43:21
when I was prepping for the episode with Alex and I went

00:43:24
back and I read all the stories about Russiagate that you were

00:43:27
bylined on, many of them. And I found the stories to be very

00:43:31
even-handed and not alarmist, just sort of laying out what

00:43:34
was in Facebook's report that they were going to be

00:43:37
demonstrating to Congress. And that's the same with the New

00:43:40
York Times and the Journal and the Guardian, all these places.

00:43:44
But then we see an article by a very smart journalist, columnist

00:43:49
Margaret Sullivan, who completely misinterprets the

00:43:52
conclusion of that report. Who we love.

00:43:54
I feel like we've had to criticize her but most of her

00:43:56
work, well, we think is great. I said she's smart.

00:44:00
Like I think she's great, but she misinterpreted the report

00:44:03
and said, we need to come to the realization that had Facebook

00:44:06
not have allowed you know, whatever mishegoss on the part

00:44:10
of the, you know, Russians to post like you know, spend eighty

00:44:14
thousand dollars promoting false,

00:44:16
fake articles, then Donald Trump would not be president, which is

00:44:18
not something that Facebook's report ever said, there's no

00:44:20
backing of that. And I think what happened here,

00:44:23
and what probably happens so often with Facebook, is that a

00:44:26
well-reasoned and nuanced story gets out there and gets, you

00:44:32
know, mixed up in all of the personal issues that people have

00:44:35
with the government, with Facebook, with Mark Zuckerberg,

00:44:38
with whatever. And and I think it's and I think

00:44:41
this is what people don't recognize about the media.

00:44:43
When I say people, I mean, like, tech critics, is that there's

00:44:45
only so much that we can do, because once the sort of stories

00:44:48
out there, it's out of our control, it was always amazing

00:44:50
to me, you know, I covered a lot of uber, I wrote a lot of the

00:44:54
negative stories about Uber, and then I'd come across people that

00:44:57
had such a negative view of Uber that I didn't share, you know

00:45:03
what I mean? Like it is just amazing with

00:45:06
like, you're not. I mean, obviously there are

00:45:08
other things, and people have different sorts of moral

00:45:12
calculuses, but it is interesting that, like, people can read your body

00:45:15
of work and then come to just a much more, like, a

00:45:20
definite and vehement view. Sort of, the activist crowd has a

00:45:26
different sentiment than like reporters do.

00:45:29
And they read this stuff with, like, such a lens.

00:45:33
I think what Tom said, it's sort of speaks to two things.

00:45:36
One, the idea of the media, the problem with the media and the

00:45:39
reason I hate the moniker, the media is that it clumps

00:45:44
reporters and columnists into one group.

00:45:47
When reporters and columnists are almost always working at

00:45:49
cross purposes, right?

00:45:51
Columnists, like, I had this debate with a friend of

00:45:54
mine, who's a, he's a columnist, and a thought leader here in

00:45:57
Washington DC. And he has like very little

00:46:00
regard for reporting. He thinks it's dumb.

00:46:02
He thinks that he's the person who controls the conversation,

00:46:04
which is fine, because actually I kind of agree with this

00:46:06
assessment. But the thing is, he wouldn't

00:46:09
be able to control a conversation.

00:46:10
If reporters were not mining facts and bringing them to him.

00:46:13
to converse about. That said, his job is something that hurts, and

00:46:22
I would say his job, what it will do, is it'll take

00:46:25
attention away from the facts on the ground because he is running

00:46:28
it through his worldview machine.

00:46:30
So the idea that the media is this monolith includes both

00:46:35
commentary and reporting, I think, is horrifying. But

00:46:38
there seems to be, like, no way to change it. Like,

00:46:41
what do we do to undo this knot in the age of Twitter, too?

00:46:44
Yeah, right. It's like we're all the media.

00:46:46
It's like that Gary Shteyngart book, Super Sad

00:46:48
True Love Story. Are you Media or are you Credit?

00:46:51
Like, it's this horrifying world. Anyway.

00:46:54
And then the other thing is, I guess,

00:46:56
So you have this, like, view of the media,

00:46:58
And then, I think, his view on Facebook, running it through

00:47:01
that machine. One of the reasons why we then

00:47:03
start depending on these very lazy comparisons.

00:47:06
Like the tobacco industry comparison is because it's like

00:47:09
one that everyone in the media seems to be able to be

00:47:13
comfortable with, because there might be some reporting to bear

00:47:16
it out, and it's, like, easy for op-ed writers to kind of use

00:47:20
because it's so visceral when they're writing a column, when

00:47:23
when those sorts of comparisons are, like, incredibly

00:47:27
lazy and make zero sense, like, at all. You know, a tobacco

00:47:30
company made cigarettes and pushed them to people. Facebook created

00:47:34
some technology. But the product is made by Deepa

00:47:37
and me and Tom and Eric, we're the ones pushing toxic shit to

00:47:42
one another. So you can stop

00:47:45
the tobacco industry by shutting down their factories.

00:47:47
The only way to stop Facebook is to get all of us off the

00:47:50
platform. That's totally different.

00:47:52
Well, I don't know about that. Yeah.

00:47:55
The algorithm, I think, is a complicating factor there.

00:47:58
I mean, like, yeah. Yeah, absolutely.

00:48:00
The algorithm has a lot of control, but at the end of the

00:48:02
day, it's the people creating the real excitement.

00:48:04
I actually think, I mean, Deepa is reacting to this, but Katie's

00:48:06
position is very far from, I think, Deepa, what you're saying,

00:48:11
simply that it's not. I think that might be giving

00:48:15
Facebook too much credit. I mean there is this really

00:48:19
fascinating story that was in the original series.

00:48:22
I wasn't involved, my colleague Keach wrote it, and it was

00:48:25
a, there's this Eastern European political party that, after

00:48:29
Facebook changed, like, its News Feed algorithm.

00:48:31
They complained to the company and said, listen, like, before it was

00:48:34
like 50/50 negative and positive, but ever since you've changed the

00:48:37
algorithm, like, the positive stuff isn't

00:48:40
reaching. So we've now shifted to, I think

00:48:42
80/20 and like that's an example of like that's not coming,

00:48:47
that's a party responding to, you know, whatever

00:48:52
Facebook's, whatever the News Feed algorithm, is promoting,

00:48:55
right? You look at these top Facebook

00:48:57
content producers, I mean, they're all spam,

00:48:59
right? It's clear that if you want to

00:49:01
get engagement on Facebook, I mean,

00:49:04
BuzzFeed's whole insight was, like, chasing the algorithm.

00:49:08
You know what? Like I mean obviously there.

00:49:10
Oh yeah, I'm not, I'm not, I guess what I meant by

00:49:12
that was just that, by having that kind of lazy comparison,

00:49:15
what it does is it distracts us from any kind of real solution

00:49:19
because the comparison implies a solution that does not apply to

00:49:22
this industry. Well I don't think the

00:49:25
solution for tobacco was, like, well, just shut

00:49:28
down the factories and don't make

00:49:30
the cigarettes anymore. So if you're going

00:49:33
down that road, well, that's not it. The solution was that

00:49:37
we made the tobacco companies try to wean people off their

00:49:40
product, and there is a similarity. No, the

00:49:42
solution was actually making the product too expensive.

00:49:45
The solution was making the product very, very expensive.

00:49:47
That was the solution, a market solution.

00:49:49
Well, making the product less addictive, which Facebook

00:49:51
could do. There are a lot of things

00:49:53
they could do, but they never made

00:49:55
tobacco, they never made tobacco

00:49:57
less addictive, trust me. Well, it

00:49:59
Is regulated, right? Well, the amount of nicotine

00:50:04
was brought down, but they never made it less addictive.

00:50:07
That is a ridiculous thing to say.

00:50:09
But the idea that using the metaphor will get you to a

00:50:12
solution on social media, I think, is just wrong.

00:50:16
It's wrong. And yeah, you know, this story

00:50:19
in particular, I think, really sparked those comparisons, but I

00:50:24
just think it's an opening. Look, we want to know more,

00:50:27
right? Like, did you guys introduce

00:50:29
that metaphor? Is that in the story?

00:50:31
I do not think we did introduce that metaphor.

00:50:35
It was an enterprising op-ed columnist who read the

00:50:38
story? Yeah.

00:50:39
If you did, we can take it out. But, you know,

00:50:42
Blumenthal said it in his, in a comment to us.

00:50:47
Marc Benioff said it before, you guys even realize. Remember? Oh,

00:50:50
that's pretty awesome. I mean, I don't know if it was

00:50:52
the first one but his was after Trump's election.

00:50:55
Because, you know, there was this period where all these tech

00:50:57
leaders that were not social media companies per se were trying

00:51:00
to differentiate themselves from them.

00:51:02
Tim Cook most memorably, but Benioff also, which I think

00:51:06
underplays the fact that, you know, Salesforce almost bought Twitter.

00:51:10
Yeah. As if Salesforce was just a super

00:51:12
addictive product. I mean, I love my Salesforce, can't get

00:51:15
enough of my Salesforce. But, um, yeah, I know, right?

00:51:20
They were gonna be in that game. He was gonna buy the worst one

00:51:22
of them all there. He was gonna buy, like, the absolute,

00:51:24
like, pure distilled cesspool of the internet. Which, I'm gonna

00:51:29
ask a closing question. I feel like we reporters, you

00:51:34
know, I don't know, there's a strong pull, especially if

00:51:37
you work for a big outlet, to not want to say, like, that

00:51:40
the story is fully, like, airtight, like you wouldn't change

00:51:43
anything. Like there are a lot of

00:51:45
institutional... I do think a failure of media

00:51:47
is, there's a lot of institutional pressure to say,

00:51:49
like stand by every word of the story.

00:51:52
we wouldn't change anything, you know? But is there anything in the

00:51:54
Instagram story that you would do differently?

00:51:57
Now, sort of just watching it all play out? Knowing that you

00:52:00
probably can't say yes. But I mean, I do think it's an issue

00:52:04
with Media that we will never say yes.

00:52:07
I don't, you know... I wish it had come across more

00:52:11
That I can, I can I look, I wish it and come across at this was

00:52:14
based on, not just like one or two internal studies, but

00:52:18
several, I wish it went to come across more, was that?

00:52:22
Yeah. There's a range of studies.

00:52:23
Some of them were small; some of them included tens of

00:52:26
thousands of people. I wish that it had really come across somehow —

00:52:30
like, the idea that these weren't our findings, right?

00:52:35
Like, we aren't quoting ourselves; we're using

00:52:40
the researchers' descriptions. This is what they think it is.

00:52:45
That invites sort of a hard response.

00:52:47
Facebook's is sort of like: it's our research, and it's bad,

00:52:51
right? That's sort of their defense,

00:52:52
right? And it's like, then why

00:52:54
should you be headlining research that they think is,

00:52:57
like, pretty weak? But do they think it's weak?

00:53:00
And I guess the thing is that we're looking at five

00:53:03
different studies, and they all point in this one

00:53:06
direction. That's where — and I thought, this is the thing

00:53:12
that I really wish had come across.

00:53:14
It's just like — even

00:53:16
if all these studies are really inconclusive,

00:53:18
what's really important is, how does the company respond?

00:53:21
How does the company, like, decide to address these

00:53:27
issues? Maybe they do a big, deep study and they find it's a

00:53:30
really small number of people. But remember that even in

00:53:33
their public statements afterwards, it was like, you know, even

00:53:36
one person who feels that this all started on Instagram is

00:53:40
too many, right? So their standard

00:53:43
is quite high, right? They're not —

00:53:46
you know, it's not like how they're responding is shutting

00:53:49
down all the teams, right? Or — just for our listeners —

00:53:51
Well, they rolled out the controls, right.

00:53:53
I mean, that happened a couple weeks ago, but that's something

00:53:55
they could have done years ago. Yeah, yeah, so they did controls.

00:53:59
They put a pause on

00:54:02
Instagram Kids, although they haven't committed to

00:54:05
killing it, right? So they're still thinking of

00:54:06
going forward with that, but they're going to go slow. And, you

00:54:12
know, Adam Mosseri was in front of Congress, and there have

00:54:14
been some changes, there have been some

00:54:17
discussions. You're saying the critical part of the story is how

00:54:20
they responded — just to spell that out.

00:54:23
The part that we've outlined in our story is, when presented with

00:54:27
different options, they picked an option that

00:54:28
didn't necessarily have the biggest mental health impact,

00:54:32
but did have a PR impact.

00:54:34
Hiding likes, right? But there's a way to frame that

00:54:37
as sort of their going along with the media — in Katie's version,

00:54:41
where it's like this chattering class

00:54:43
that's not very informed — not necessarily what the beat

00:54:45
reporters would say they should do.

00:54:50
Right. I mean, in some ways, like, the

00:54:52
public and the media chattering class would have thought

00:54:55
removing likes was the best idea, and in some ways the media

00:54:59
pushes them toward what seems intuitively useful rather

00:55:04
than what scientists or researchers,

00:55:07
right, think is useful. And I mean,

00:55:09
all the points I said just now that I

00:55:12
wish had come across more are in the story —

00:55:15
there's a lot of nuance in the story.

00:55:17
I was also surprised by the internal research that showed

00:55:20
that removing likes wasn't really gonna make

00:55:24
an impact. That was counterintuitive to me.

00:55:26
Yeah. But I mean, that's the

00:55:29
conclusion. And, you know, I do wish — even

00:55:33
though it's part of our story —

00:55:34
it's like the part that a lot of people didn't focus on;

00:55:36
they focused on other aspects of it. I think those are really

00:55:38
important and I don't want them to get lost.

00:55:40
Right? And, you know, I mean, to finish

00:55:42
up on the tobacco thing too: I think there's

00:55:44
actually nuance in bringing it up — not because Facebook

00:55:48
causes cancer, but because this was

00:55:50
internal research that the company ended up sort of

00:55:53
ignoring because it was inconvenient to their

00:55:56
bottom line. And again, I think that that's

00:56:00
one of those things where it actually is a good metaphor in

00:56:02
that sense. But it's not —

00:56:03
it's not the accepted interpretation of it, which

00:56:06
almost makes you think it's maybe not a great metaphor.

00:56:09
But I do want to make sure that we get to one quick thing here,

00:56:11
because it's super relevant to Deepa.

00:56:13
So what are your thoughts on Frances Haugen and her

00:56:17
becoming a media figure? Because you guys had her

00:56:20
anonymous — the research was given to you by an anonymous

00:56:23
person — and she is the engine behind all the reporting behind

00:56:27
it. And then on the last day she

00:56:29
sort of steps forward and is going to be testifying before

00:56:31
Congress, and now, you know, is — I don't know if on the

00:56:34
cover, but had, you know, a photo shoot for Vogue, and she's got a

00:56:37
rival book with the Wall Street Journal. Now they're

00:56:39
competitors. I don't think that's right.

00:56:45
Yeah, because the Journal won't explore her, like,

00:56:48
beliefs on crypto, for example. Were you surprised

00:56:52
that she decided to come forward in the way that she decided to

00:56:55
come forward? And how do you think her

00:56:57
becoming a figure impacted the interpretation of the

00:57:01
studies that she made? Was I surprised by the

00:57:08
way she came forward? I mean, what is the

00:57:12
playbook for coming forward after you, you know?

00:57:16
Like, is there a parallel? Like, I guess I didn't

00:57:19
know what to expect. Yeah, okay.

00:57:20
I mean, I didn't really know what to expect.

00:57:22
I didn't really know — like Chelsea Manning, also —

00:57:25
what it would look like for somebody who had kind

00:57:31
of copied all these different documents, and, like, how she

00:57:36
would... Yeah, I got it.

00:57:37
I didn't really know how she would come forward and think

00:57:40
about and present herself. So it's hard to say

00:57:44
whether or not anything surprised me, because I

00:57:45
truly just had no expectations.

00:57:47
And, you know, your second question —

00:57:52
I'm just thinking, like, how does it change the effect of the

00:57:56
actual documents? I've been

00:57:59
talking to a lot of people — activists who are on the

00:58:03
left and people who are on the right, and a lot of people who

00:58:09
believe that Facebook needs to be broken up.

00:58:11
And one thing that's been very interesting is, like, at least

00:58:14
when they're talking to me, there's a distinction

00:58:16
being drawn between her, the figure, and the documents.

00:58:21
Right? Like, she doesn't think

00:58:23
that Facebook needs to be broken up.

00:58:25
You know, she actually thinks that would be

00:58:27
bad for society, et cetera. That makes her, like, a

00:58:31
complicated figure for a lot of the maximalists.

00:58:33
Exactly. And, you know, antitrust

00:58:36
legislation is like one of those things that's bipartisan.

00:58:40
There are plenty of people on both the right and the left

00:58:44
who think that Facebook should be broken up, should be reined

00:58:47
in, all these different ways. And, you know, she doesn't

00:58:51
comport with those beliefs, and I

00:58:56
think that that makes it really complicated. But,

00:58:59
like, the actual documents themselves — and that's what

00:59:02
distinguishes her, I think, from basically everybody else;

00:59:04
there have been a lot of Facebook dissenters — she's come

00:59:06
out with her own opinions and a description of her

00:59:11
experiences, which I think are really interesting, but

00:59:14
she's got documents. There's a lot of hard evidence

00:59:18
of the company's internal conversations

00:59:20
around these issues. And I think plenty of people

00:59:23
just separate that. They're like, you know what, that's

00:59:26
great. Thank you so much for the

00:59:27
documents. Did she betray The Wall Street

00:59:29
Journal? I think that Frances had always

00:59:31
wanted these documents to go out there and to be kind of read and

00:59:39
interpreted and understood by as many people as possible.

00:59:43
I'm pretty grateful. She gave it to us first and let

00:59:47
us run with it for as long as we did.

00:59:51
I mean, it's a fun conversation, but, like, you know, we think deeply

00:59:55
about some of these Facebook issues. On just a pure personality

00:59:58
level — just, like, some fun Facebook stuff.

01:00:01
Like, I know, obviously, if you had the full picture, there

01:00:05
would be a profile of this, and maybe you have to hold back some.

01:00:07
But, like, your general sense of the Mark and Sheryl relationship,

01:00:12
and then Marc Andreessen and Peter Thiel — to the extent you

01:00:16
can, do you have any observations about the role they play on the

01:00:19
board? They're just fascinating people. What's your sort of

01:00:22
gut sense of how involved they are right now —

01:00:26
what you would tell us. Yeah, I mean I've been reporting

01:00:30
on both of those topics, and it does seem like — and the Times has

01:00:35
done some good reporting on this too —

01:00:36
the Mark and Sheryl partnership was always like the

01:00:42
co-CEO thing. And then it feels like after the

01:00:45
2016 election, there was just a loss

01:00:48
of trust or something. There was a

01:00:50
feeling of, oh, well, politics was supposed to be Sheryl's job,

01:00:54
right? And she maybe dropped the ball —

01:00:56
but is that a reasonable expectation? — and picked the wrong

01:00:59
team and was going to leave for the Hillary Clinton administration.

01:01:04
And this is like the dark universe for her.

01:01:07
Like, this wasn't supposed to happen — this is the darkest timeline.

01:01:11
And obviously I celebrate her for picking the wrong team, you

01:01:14
know? I mean, but as a pure, sort of

01:01:18
cynical, calculating move, it

01:01:20
was a loser. But there is a real question to be asked, like,

01:01:24
why was that her job? Why was it all left up to her

01:01:27
anyway, right? Like, that isn't really a

01:01:29
realistic way of structuring a company.

01:01:31
And so what I think you've seen is, rather than like a co-CEO

01:01:34
thing, she's just become one of the

01:01:36
lieutenants. One of the things

01:01:39
that we got through the documents is, like, org charts,

01:01:47
right? Like, the ability to see —

01:01:49
so we did this graphic in October that showed

01:01:54
that over time, Sheryl's share of the company has actually

01:01:58
shrunk, right? She used to have, like, a

01:02:00
lot of people — I don't remember the numbers anymore — but

01:02:03
she used to have a big chunk of the company, and, like, over the last

01:02:06
five years it's gotten smaller. And, like, what does that tell you

01:02:09
about their partnership, right?

01:02:10
And so I think that that era of Mark and Sheryl is sort

01:02:16
of over, and it's now just Mark. There were moments

01:02:20
during the Cambridge Analytica scandal where Sheryl would

01:02:22
tell her friends — we reported this, I think other

01:02:24
people reported this — she thought she was going to get

01:02:26
fired, because Mark was really angry about the way in which

01:02:32
the company was being portrayed in media and press, whatever.

01:02:38
And I think that in itself is just, like, interesting to me,

01:02:42
because, like, there's a lot of things that Facebook

01:02:45
apparently thinks are — like, a lot of Facebook executives think of as —

01:02:47
comms problems. And like, hey, if you'd only

01:02:50
just, like, explained this properly, then the press would get it, and

01:02:55
then they would get off our back. But some of these issues

01:02:59
are not comms problems. A lot of these issues are

01:03:01
product issues, right? And, like, I don't

01:03:04
know what a comms person can really do to, like, solve a

01:03:08
product issue. And I think, like, that continuing

01:03:11
tension is there. But, you know, I don't know.

01:03:17
It just does feel like Sheryl has just retreated a lot — like,

01:03:21
she's really receded. She's in the back; she's not his

01:03:23
co-equal partner anymore. At some point during the last

01:03:26
few years, she just

01:03:28
started to say something — or at least people started to tell me

01:03:30
she was saying things like, I serve at the pleasure of Mark.

01:03:37
That's always a good sign.

01:03:38
Yeah. I mean, that's basically what

01:03:41
advisors say, which — right. Her role

01:03:43
now is incredibly diminished from the Lean In days.

01:03:48
Do you have a Peter Thiel, Marc Andreessen take?

01:03:54
Oh yeah. Thiel in particular — I don't

01:03:58
have, like, the strongest read on Andreessen's role.

01:04:01
It does feel like he's — I mean, just from the title of your

01:04:04
podcast, you know — he's a Mark Zuckerberg booster

01:04:07
right now. Well, they're deep in web3,

01:04:07
though. Like, Chris Dixon is on

01:04:08
Twitter shitting on web2 while Marc Andreessen is on the board

01:04:13
of Facebook. They don't name Facebook, but to me, there is a

01:04:16
sort of — I mean, that's the magic of Andreessen, that somehow — I

01:04:20
don't know. Now we're seeing Jack Dorsey

01:04:21
and people sort of fight back against it. But that they could

01:04:24
simultaneously have been responsible for, or heavily involved

01:04:27
in, web2 while arguing that web3 is an indictment of

01:04:32
web2 is mystifying. And Facebook is also trying to

01:04:36
graduate, like, aren't they, right?

01:04:38
Like, I don't know. That's

01:04:42
the kind of thing that I think Marc Andreessen has

01:04:45
also just kind of gotten away with. You know, he said a lot of

01:04:49
things — do you remember the whole India thing?

01:04:50
Yeah. Oh yeah.

01:04:52
Where he said something like, well, I think it was,

01:04:55
oh, how did independence work out for your economy, India?

01:04:59
Like, when Facebook's internet.org got, like,

01:05:03
kicked out of the country. And then he had

01:05:07
to, like, delete it off of Twitter, and Mark Zuckerberg

01:05:11
made him make an apology. And you don't know core things,

01:05:13
like: did Marc Andreessen support Donald Trump?

01:05:16
Like, I feel like there isn't a definitive answer.

01:05:20
Like, I don't know. He has

01:05:21
avoided media to an extent that we don't have clear answers

01:05:25
about huge, huge things. What about Thiel, though?

01:05:30
What do you have on him? I think Thiel's interesting.

01:05:32
Yeah, Thiel is definitely

01:05:35
interesting. Thiel's one of these — like, there are always rumors

01:05:39
kind of circulating that Thiel wants to leave the board,

01:05:42
you know, that Thiel's, like, disillusioned. But he stays on the

01:05:46
board, and, as far as our reporting goes,

01:05:54
he's been influential, right?

01:05:58
There was a point when Facebook had a couple of

01:06:04
board members who wanted to create, like, an independent

01:06:06
committee that would report to the board

01:06:09
and kind of think through all the different, like, risks and

01:06:11
issues and whatever that

01:06:15
might imperil Facebook. And, you know, the board

01:06:19
said no, right? Because this wouldn't report to

01:06:21
Mark; it would report to this independent committee that would be

01:06:23
outside of his control. Thiel — you know, our reporting and our

01:06:27
understanding is — has been a big

01:06:30
encourager of Mark maintaining that kind of control and making

01:06:34
sure that he's sort of at the center of everything, right?

01:06:38
And I think that's a very

01:06:41
powerful, interesting, somewhat mystifying relationship,

01:06:44
because I'd love to know more. So it's interesting too because

01:06:47
I've heard, you know, Thiel is going to be moving here to

01:06:53
Washington, and he does still have a lot of influence that he

01:06:58
managed to grow during the Trump era, here with a lot of people

01:07:02
who will be making decisions about what to do around social

01:07:04
media companies. And I wonder if that will change

01:07:09
his relationship with Mark — improve his relationship with

01:07:09
Mark, make them closer — if he continues to try to become

01:07:12
a Washington player. That was Max Chafkin — who we've also had on

01:07:15
this podcast — his argument was, yeah, that Peter Thiel is only

01:07:19
going to become more politically influential, that this is the

01:07:22
beginning. I'm not sure I'm convinced of

01:07:26
that yet but it's a provocative thesis.

01:07:29
Well, if you're Mark Zuckerberg, why not just keep

01:07:32
him close? Because that could be a very valuable

01:07:34
relationship for you. And, you know, remember,

01:07:37
Peter was also there at the beginning — like, he knew Mark way back

01:07:40
when, right? He's one of

01:07:42
the first investors in Facebook and a big believer in and

01:07:46
legitimizer of the company. And Mark is loyal.

01:07:49
Like, I think he appreciates and, I think, values loyalty, and

01:07:54
I think that factors into their relationship as well.

01:07:57
Again, these are sources who described him.

01:08:00
I have not talked to Mark directly about this relationship.

01:08:04
Let's actually end on that note.

01:08:06
So when's the last time you talked to Mark? What was

01:08:10
your sense of the man and the way he views things?

01:08:14
It's been a really long time, you know.

01:08:21
I didn't see you in any of the Instagram videos.

01:08:24
So you haven't been invited to go waterskiing with him — I'm

01:08:28
sure the invite just got lost. What do I think of him?

01:08:31
I mean, I think he's — I don't know.

01:08:33
He's not, like — so I know that there's this

01:08:36
reputation for him as, like, robotic and stuff, but I

01:08:41
don't know if I ever actually experienced that.

01:08:44
Thanks so much for coming on. Much like the Sheryl scenario,

01:08:47
we serve at your pleasure as our friend. So thank you —

01:08:52
thank you for doing this, a loyal listener.

01:08:56
Yeah, love listening. I actually really do love

01:08:59
listening to this podcast, and I don't say that about a lot of

01:09:01
podcasts. Which is interesting, given the fact we've had

01:09:04
multiple guests on and we ourselves have dissected so many of the

01:09:07
articles that you have done. And so I appreciate that you've

01:09:11
continued to listen. Yeah.

01:09:13
Now she gets the opportunity —

01:09:15
you know, when you're, like, reading a story and you're, like,

01:09:17
shouting into nothing about, like, your response to it.

01:09:20
Now she actually — I mean, there definitely have been a

01:09:24
couple of moments where, like, you know, where

01:09:27
someone was saying, like, well, the Instagram thing is based on

01:09:30
only, like, a couple of studies, or small ones.

01:09:32
I wanted to be like, no, it was like five or six studies, one of

01:09:37
them tens of thousands of people. But whatever.

01:09:40
So that's me shouting into the ether now.

01:09:42
Thank you so much, Deepa. Thank you.

01:09:44
Yeah. Thanks for having me guys.

01:09:58
Goodbye. Goodbye.

01:10:00
Goodbye, goodbye, goodbye. Goodbye.
