The Media's Facebook Deflection (w/Alex Stamos)
Newcomer Pod · November 23, 2021 · 00:48:17 · 44.21 MB

We talked to former Facebook Chief Security Officer Alex Stamos about what the media got right and wrong about its coverage of Facebook’s influence on the 2016 election. Stamos — who played a key role in bringing information to Robert Mueller about Russian election interference — is someone who is willing to criticize his former employer without letting the media off the hook.

Stamos argues that Facebook inadequately addressed misinformation posted onto Facebook’s platform and downplayed its discovery of Russian election interference on its platform. BUT, Stamos argues that the media played a far larger role in helping Russian election interference by gleefully publishing stolen Democratic National Committee emails.

We decided to check in with Stamos as the credibility of the Steele dossier has continued to unravel.

We talk about the media’s failure to soul search over its own role in the hacked election. We also discuss the Facebook Files and Stamos’ objections to some of the latest reporting on Facebook.

Background reading:

Opinion: Indictment of Steele dossier source is more bad news for multiple media outlets

How Did So Much of the Media Get the Steele Dossier So Wrong?

Collusion? Who needs it when Facebook was allowing Russians to help Trump?



Get full access to Newcomer at www.newcomer.co/subscribe

00:00:06
Welcome. Hey everybody.

00:00:14
Welcome to this week's episode of Dead Cat.

00:00:16
This is Tom, joined as always by Eric and Katie, and joined this week

00:00:21
by our very special guest Alex Stamos, the director of the

00:00:24
Stanford Internet Observatory, though you may all know him for

00:00:27
his earlier work as the chief security officer

00:00:30
at Facebook, where he oversaw security on their

00:00:32
platforms at a time that things got a little weird

00:00:36
over at Facebook. Spicy. Yes, spicy.

00:00:40
So this week, we're going to go deep on Facebook and Russian

00:00:42
interference and, more specifically, if we can get

00:00:44
there, how the media reported on this

00:00:47
crazy brain breaking time that we probably will never recover

00:00:52
from. Before we get to that, though:

00:00:53
Quick public congrats to Friend Of The Show.

00:00:55
Ken Griffin of Citadel — big winner this week.

00:00:58
Dead Cat sponsor Ken won the auction for

00:01:00
our in-house copy of the U.S. Constitution. Ken beat out

00:01:04
ConstitutionDAO. Better luck next time, guys, and congrats to

00:01:08
Ken for winning. All right, Alex.

00:01:10
So this week we want to dive into a look back on Facebook and

00:01:14
Russia and the media. The reason it's interesting now

00:01:17
is that there was this recent arrest of Igor Danchenko, an

00:01:21
analyst who was indicted in an investigation led by the DOJ

00:01:25
that's looking into all the Trump-Russia 2016

00:01:29
and-what-have-you. There's actually a great piece

00:01:32
about it from Erik Wemple in the Washington Post that we can

00:01:34
link to in the show description. But to make this short, which is

00:01:37
never easy to do with Russiagate stuff:

00:01:39
this indictment essentially led to the collapse of the Steele

00:01:42
dossier and a number of retractions for media

00:01:45
organizations. But what was never in doubt throughout all of the

00:01:49
investigations was that Russia did interfere in the election

00:01:53
and continues to interfere in elections, as do many other

00:01:56
countries around the world and there's probably no one who

00:02:00
knows that better than you, Alex. While at Facebook, you

00:02:02
were one of the first people to grasp, really, how Russia used

00:02:06
the platform for their interference campaign.

00:02:09
And you worked directly with the Mueller investigation to explain

00:02:14
that process. And then you got to see it

00:02:16
covered by the media, basically in real time — poorly.

00:02:20
What's the score on how it was covered by the media in

00:02:23
real time? I think that's what we're going

00:02:24
to discuss. Yeah, yeah.

00:02:26
Well, you know, can you maybe walk us through

00:02:31
what it was you saw on Facebook's platform?

00:02:34
Yeah, we've got to get Alex a launching point here—

00:02:36
—Yeah, what exactly did you see

00:02:38
while you were at Facebook? And how did you— Are we going to

00:02:40
cut right to the end? You guys gotta let me warm up

00:02:43
into the takes. Like, I don't want anybody to

00:02:44
actually know what happened. It's a special thing.

00:02:47
Nobody knows what happened. And it's all just hot

00:02:50
takes. Takes first, smart blather later.

00:02:54
Yeah, maybe— like, yeah, we will get to the media soon enough,

00:02:56
Eric, but we got to get a foundation here.

00:02:58
You can read all about it. Yeah, in An Ugly Truth by

00:03:02
Katie's colleagues, which is soon to be a miniseries, which means

00:03:06
all of my friends are texting me with who should

00:03:09
play me in the adaptation. —What have you heard so far?

00:03:12
Can we get some casting rumors going?

00:03:13
Well, apparently Zach Galifianakis is the current

00:03:16
leader. Making a dramatic turn is a surefire way to win an

00:03:19
award but also reinvent your career, so it's a win-win

00:03:22
either way. I think it would be a great move

00:03:23
for him from a dramatic perspective.

00:03:24
I'm hoping for John Stamos, you know, even though he's like 10

00:03:27
years older, I feel like it's the same.

00:03:29
"Stamos on Stamos" could be the name of your podcast.

00:03:32
When you talk about the— Right, I'll have to do a real-time

00:03:35
podcast with the episode recapper.

00:03:38
I was criticized for making this too light.

00:03:41
Now you guys are— Okay.

00:03:43
Katie, ask a real question. Okay.

00:03:44
Okay. Okay, so tell us what you saw in

00:03:47
2016. So let's take— so, look at the big picture.

00:03:50
There's like maybe four different themes of Russian

00:03:54
interference in the election, right?

00:03:55
And so the first, which I'm not going to talk about, just to

00:03:58
mention, is like all of the outreach—

00:04:00
it's the relationships to the Trump campaign, right?

00:04:02
So, you know, all of that stuff, of which the Steele dossier is part

00:04:05
of it. There's a lot of stuff in the

00:04:06
Mueller report and then the Senate Intel reports.

00:04:10
And so, there's the physical stuff right from a propaganda

00:04:12
perspective. Another part we won't really

00:04:14
talk that much about is, what is called, kind of white or gray

00:04:17
propaganda which is the propaganda that can be tied to a

00:04:21
government. And so that's like Russia Today,

00:04:22
Sputnik that's called White propaganda.

00:04:25
And then kind of the two darker more mysterious ways that they

00:04:29
influenced the election.

00:04:31
The first was the GRU activity. So the GRU is the main

00:04:34
intelligence directorate of the Russian military.

00:04:37
The GRU has existed for a long period of time.

00:04:39
It's military intelligence. They have a hacking side and

00:04:43
they have a propaganda side and the hacking and propaganda sides

00:04:45
work together. So this is what we see from kind

00:04:47
of the really high-end state adversaries — these kinds of

00:04:50
hybrid operations, where you have both a offensive computer

00:04:54
network intrusion capability combined with the propaganda

00:04:57
capability. And so, with the GRU, what they did was they

00:05:00
broke into the DNC's email server.

00:05:02
They broke into John Podesta's email, the email of thousands of

00:05:04
other people. It turns out most of them didn't

00:05:06
have much to do with this, but the campaign that they uncovered—

00:05:10
—Pizzagate, exactly right. We broke that thing out.

00:05:14
Eric's using this to spread disinformation.

00:05:16
Fortunately, if you have a whole podcast of disinformation,

00:05:18
Spotify will give you a hundred million dollars.

00:05:20
So maybe this is the way for you guys to— We're working on

00:05:27
it. And so you have the GRU activity, where they break into

00:05:31
all that stuff, they steal all this information and then they

00:05:35
release it in a variety of different means, including via

00:05:38
WikiLeaks, including via personas that they created

00:05:41
themselves. Their goal was to change the

00:05:43
conversation about Hillary Clinton. So that's the GRU

00:05:46
activity, and then there's the IRA activity.

00:05:49
So IRA in this context — we're talking about the Russian

00:05:51
Internet Research Agency. That is a private group.

00:05:55
They belong to a guy named Yevgeny Prigozhin.

00:05:58
He is — and was — what we considered an oligarch in Russia.

00:06:01
He has this troll farm, which at the time was in this building in

00:06:04
St. Petersburg — they've actually

00:06:06
moved since then, but this famous building in St.

00:06:08
Petersburg — at which they would build propaganda and spread it

00:06:13
on social media, and they did that through the creation of

00:06:16
thousands of fake accounts of all of these personas that would

00:06:20
pretend to be from the countries where they spread it, and then

00:06:23
building up audiences and pushing it, part of that

00:06:25
includes advertising. And so the Internet Research

00:06:28
Agency ran about a hundred and fifty thousand dollars in ads on Facebook. The

00:06:32
ads were not really for the creation of the propaganda — that

00:06:34
was for the creation of audiences.

00:06:35
So their goal was to use ads to Target groups that might be

00:06:40
interested in their content and then pull them into liking pages

00:06:43
and then most of the content was pushed via Facebook pages.

00:06:47
So those are the identities for these groups.

00:06:49
They would create, like, a pro-immigration group and an

00:06:51
anti-immigration group, a pro-Black Lives Matter group and an

00:06:54
anti-Black Lives Matter group, stuff like that.

00:06:56
Their content in the end was something like 80,000 pieces of

00:06:59
content, seen in the end by over 130

00:07:02
million Americans. That was the number one topic

00:07:05
that they were talking about. —And you were at Facebook when this was

00:07:08
happening? Or give us where you sort of fit into this time,

00:07:13
right? So I was at Facebook, when this

00:07:14
happened. I joined Facebook in 2015, and as chief security

00:07:18
officer my primary job was keeping people from hacking

00:07:21
Facebook, right? So actually feel pretty good

00:07:23
about succeeding at my primary thing — right, right.

00:07:26
So we actually did a lot of really good work on security

00:07:28
stuff while I was there. And so that's like my primary

00:07:31
job. But then I also had a number of

00:07:32
groups that worked on the abuse of the platform to cause harm,

00:07:35
Most specifically, I had investigators that worked

00:07:38
on child safety, that worked on

00:07:40
counterterrorism. So we had the counterterrorism

00:07:42
investigators on my team, and then we had this threat

00:07:44
intelligence team. And so the threat intelligence

00:07:46
team is mostly ex-US, and then a couple of others from Western

00:07:51
government intelligence — analysts who worked at the NSA, at the CIA and

00:07:54
such — whose job it was to track different

00:07:56
governments' activity — both attacking Facebook

00:07:59
but then also using Facebook to do bad things.

00:08:01
And so kind of a normal work week for them would be, we found

00:08:05
four accounts that we can tie to the Islamic Revolutionary Guard

00:08:08
Corps, and they are spearphishing State Department employees to

00:08:13
try to get access to their Facebook accounts, from which

00:08:15
they can then trick, Iranian dissidents into revealing

00:08:19
themselves and then arrest them. So, that's like an actual thing

00:08:21
that we found was this big attack by the Iranians against

00:08:24
the State Department, where we had machine learning that

00:08:27
detected that kind of activity. We figured it all out.

00:08:29
We worked with the state department and the FBI to round

00:08:32
that all up and to stop it. And so, that was their normal

00:08:34
activity. But we got pulled, we got pulled

00:08:36
into this in two different ways.

00:08:37
So first, we had a dedicated team working on APT28, which

00:08:41
is the hacking team that's tied to the GRU.

00:08:44
And we saw, in the spring of 2016, during the primary process,

00:08:49
activity by GRU that looked like they were interested in people

00:08:52
who worked for the DNC and the DCCC.

00:08:55
If you're an intelligence agent, you've been told I want to hack

00:08:59
somebody, I want to hack this organization.

00:09:00
One of the first things you do is you try to figure out all the

00:09:02
people that work there and then you learn everything you can

00:09:05
about them from their public social media profiles.

00:09:07
So, we had a bunch of accounts that we had previously

00:09:09
attributed to GRU because of their activity.

00:09:11
So we saw them doing stuff in Ukraine, and we saw them

00:09:13
doing hacks against the World Anti-Doping Agency— —Do your

00:09:16
Facebook and LinkedIn monetize that well? Or do you make— —Exactly

00:09:21
right. Yeah, we— yeah, LinkedIn—

00:09:23
I mean, Facebook doesn't sell accounts, but LinkedIn could

00:09:26
probably sell some premium accounts to the GRU.

00:09:29
I mean, most people that reach out to me on LinkedIn are affiliated

00:09:32
with some sort of foreign State actor.

00:09:33
So, yeah, it's not— it's not a bad theory.

00:09:36
And— and so we saw this activity, we saw them kind of

00:09:38
snooping around. And so they weren't doing any actual attacks on

00:09:40
Facebook. But what they were doing was reconnaissance, which is

00:09:44
the first step in this. The standard operating procedure

00:09:47
from that point was you go and you tell the FBI if they're

00:09:49
targeting Americans, and the FBI handles it. We now know the FBI

00:09:53
didn't really do much. Like they had all this

00:09:55
information about different kinds of targeting of the DNC, and

00:09:58
there are all these crazy stories. Like, I mean, you know, the

00:10:01
Hoover building is whatever four blocks from the DNC headquarters

00:10:04
or something, and somehow the message didn't make it, but

00:10:06
okay. And then later in the year,

00:10:08
after the hacks happen — which don't happen on Facebook;

00:10:10
they happen on Gmail, and then they happen directly against the

00:10:13
DNC's infrastructure — which, the DNC should never have been running

00:10:15
their own email server, but that's a different issue.

00:10:18
They come back and they create these personas on Facebook

00:10:21
called DCLeaks. So WikiLeaks is kind of a

00:10:24
useful idiot to them but they didn't have direct control over

00:10:27
WikiLeaks. And so, they pretty clearly

00:10:29
wanted to have personas to leak information that were under

00:10:32
their direct control. And so they create these

00:10:34
personas, and then they start reaching out to journalists: here

00:10:37
are stolen documents I have, these are the stories

00:10:38
you should write. Right? And that is effective, it turns

00:10:41
out. Now, they also release the data

00:10:42
in a bunch of different ways, right?

00:10:44
And in the end, they get the stories written that they want.

00:10:47
Now, the Internet Research Agency stuff came after — you

00:10:51
know, in the whole discussion of fake news and such, we had a

00:10:53
big project to kind of answer the question: of all of this fake

00:10:56
news content, how much of it came from Russia?

00:10:58
Now, in the long term, just a very small fraction of

00:11:01
all of that — the vast majority of that content is domestically

00:11:04
created. —Or homemade; we did it to ourselves, that's

00:11:07
right. Right. And— or it's created by, like,

00:11:09
financially motivated actors like the apocryphal but real

00:11:13
Macedonian teenagers, right? Like things like that.

00:11:16
But, you know, we did obviously find a big chunk of it.

00:11:18
We found the Internet Research Agency activity and

00:11:19
then announced that later in 2017. But those are kind of two

00:11:22
different things. So on the GRU stuff, we were

00:11:24
ahead of the curve on the IRA stuff.

00:11:25
We were behind the curve. —And you went to the Mueller team

00:11:29
to say, we found— What led to the IRA indictment: you reached out

00:11:32
to Jeannie Rhee on the team. You know, what was her response?

00:11:35
What was that first conversation?

00:11:36
Like? —Yeah. So, so there's an interesting

00:11:39
back-and-forth between the big tech companies and the Department of

00:11:43
Justice on all kinds of crimes. This is one

00:11:46
of the interesting things that we haven't really figured out,

00:11:48
even since 2016: how do we want this kind of law

00:11:51
enforcement to happen on these companies? Because the truth is

00:11:54
that DOJ and FBI don't have the ability to find the things

00:11:57
that the companies can, because they can't access

00:11:59
petabytes and petabytes of data without lawful process.

00:12:03
they have lots of power to get data on specific individuals,

00:12:06
either in a classified or unclassified setting, but they

00:12:08
have to know who to grab in the first place.

00:12:11
And so, this is actually a pretty common model: the

00:12:15
companies notice something bad happening.

00:12:17
You go and you send your lawyers to go brief somebody from the

00:12:20
right US attorney's office, you sign out an affidavit of:

00:12:24
this Facebook employee has witnessed

00:12:25
a crime being committed. So, structurally,

00:12:28
This is the kind of conversation that happens all the time.

00:12:29
But from a political perspective, this was very

00:12:33
different, right? In that you're not talking about reaching out

00:12:35
to, like, just the normal US attorney's office in Kansas or

00:12:38
whatever because you found a child molester there, you're

00:12:41
talking to the special counsel's office, right?

00:12:43
And it was interesting, I think, in the end — I mean, they were

00:12:45
great and they were very thankful.

00:12:47
But yeah, we went and privately told them. —You personally, or who

00:12:51
went? —No. No, the lawyers went.

00:12:52
So, I was part of preparing all this stuff. But, like, at that

00:12:55
kind of level, you're not sending somebody like me in

00:12:58
there. All the big tech companies have ex-DOJ lawyers

00:13:02
who are in-house, who handle law enforcement relationships.

00:13:05
So we had a relationship between our threat Intel team and their

00:13:08
counterparts at DOJ. So, for these big groups like

00:13:11
APT28, the FBI actually has coordinators sitting in the

00:13:15
Hoover building because, you know, the FBI's like a

00:13:17
franchise, right? Just like McDonald's.

00:13:19
There's some of them that have, like, great play places and some

00:13:21
have the really clean bathrooms. So, like some FBI offices are

00:13:25
really good at doing cyber stuff.

00:13:26
Some are not so great. So you'll end up with five or six different offices

00:13:30
looking at a campaign separately.

00:13:32
And so the FBI will have coordinators who sit in the

00:13:34
headquarters whose job it is to watch all of this data being

00:13:37
gathered and say: oh, these four different hacks against

00:13:40
different steel companies, in four different jurisdictions,

00:13:44
are actually the same actor right there.

00:13:46
So, we had direct relationships with those folks.

00:13:49
But when you get to this level and you're talking to the special

00:13:50
counsel, it's like— you're sending the lawyers.

00:13:51
So the lawyers go and brief her and them, and then we send our

00:13:54
people to go talk, after we got lawful process, to

00:13:57
go fill them in and to give them all the data that we had. —So

00:14:00
this investigation by the special counsel was probably the

00:14:04
most watched and reported-on investigation, maybe in my

00:14:08
lifetime. As your team is speaking to members of Mueller's

00:14:12
team. How did you sort of view the way

00:14:14
the media covered that? I mean, did you find that there

00:14:17
were fairly restrained and reasonable articles written

00:14:20
about it? I mean, or had the

00:14:21
sensationalism already started?

00:14:23
Because we can talk about what happens after the fact, because

00:14:25
that's when shit really gets crazy. But even just in the

00:14:27
lead-up to the Mueller investigation, what was your

00:14:30
perspective from where you sat, right?

00:14:32
I mean, I think one, so we published a white paper, I think

00:14:37
it was April or something. So in 2017, in Spring of 2017,

00:14:41
we published a paper on what the GRU did.

00:14:44
This is the paper that famously doesn't say it's Russia, but has

00:14:46
a footnote that hints that it's Russia and there's almost no

00:14:50
coverage of it, right? Like we did media briefings, we

00:14:52
did this and that, and there's only a couple of stories, right?

00:14:54
—You hint that it's Russia but not— not actually have it be Russia.

00:14:57
You just say a country and what All-Stars?

00:14:59
Backwards or something. So I can pull up the exact

00:15:02
footnote. But it says something along the

00:15:03
lines of our data is consistent with the attribution provided by

00:15:06
the US intelligence agencies in their January memo.

00:15:09
So if you remember like, before Obama goes, there's a joint

00:15:12
communiqué from the intelligence agencies saying it was Russia.

00:15:15
Right? Like, it was clearly the

00:15:16
administration's line. —Were you worried about getting

00:15:18
sued for libel, or why the games? —No. I mean, so that was a big

00:15:21
political fight internally, which I'm sure will be an entire

00:15:24
episode of this miniseries, with Zach Galifianakis being yelled

00:15:27
at by— by the queen. —Was it part of Facebook's

00:15:29
expectation to not get as much news?

00:15:31
So, Facebook— how else do you explain the lack of news coverage?

00:15:34
I think it's— I mean, there were a bunch of different things, but

00:15:37
there were comms people who did not want Facebook volunteering that

00:15:40
bad things had happened — that Facebook had

00:15:41
any part of this, which was clearly not going to work,

00:15:44
right? Yeah.

00:15:45
—And it worked perfectly, and no one's blamed

00:15:46
Facebook since, right? Exactly.

00:15:48
And so you have the comms people who were pushing that, but then you

00:15:50
also have, like, the D.C. government affairs

00:15:52
people who don't want to take a side, right?

00:15:55
And so, I think that was one of the big things is like, if you

00:15:57
say Russia, you know, you've got this new Trump administration.

00:15:59
And the right thing to do is to turn this over to the special

00:16:02
counsel and turn all the data over to the people who are

00:16:05
investigating. And our job is not to get

00:16:07
involved in the public fight, which just happened to coincide

00:16:10
with the, like, comms goals of the company

00:16:12
to not get, like, involved in anything, right?

00:16:15
So, yes, there was a big fight. Like, it was a lot just to get

00:16:18
it out. Like the fact that we admitted

00:16:19
that anything happened. But everything in there is accurate;

00:16:22
it's just, like, the whole Russia attribution section was cut out

00:16:25
in one of 85 different versions that went out, but it

00:16:29
got very little coverage, right?

00:16:30
And if there's anything we've learned from the whole Mueller

00:16:32
thing, it is that the rollout of stuff

00:16:34
matters almost more than sort of all the

00:16:37
technical footnotes. It's not like, oh, the public

00:16:40
eventually consumes it the smartest way as long as it's in

00:16:44
there. It's that the fucking

00:16:46
headline is the most important thing. And you can say, oh, it's in

00:16:49
there — I don't know. That's— that's— no.

00:16:51
I think you're right. I think you're right and

00:16:53
certainly— so it didn't get covered that much.

00:16:55
I mean, looking back it's not that great.

00:16:58
I think, like, the treatment of Facebook has to be seen in the

00:17:00
context of the overall coverage of Trump and Russia, which I

00:17:04
think, you know, Ben Smith came up with this term resistance

00:17:07
journalism, and I think there's a lot of that going on, right? In

00:17:09
that, there's a bit of a moral panic,

00:17:12
and as long as you say stuff that is bad about the targets,

00:17:15
there's not a lot of pressure from other journalists to get it

00:17:18
totally right. And so there's a lot of kind of

00:17:19
logical leaping and I think that happened both before and after

00:17:23
our announcement in September of 2017. Obviously, after we

00:17:26
announced, people kind of lost their minds. We'll get to

00:17:28
that in a bit, because I actually—

00:17:29
Yeah. We'll get to that in a bit

00:17:30
because I pulled some, some clips from that era which was a

00:17:33
great thing. I spent the— I spent, like, two

00:17:36
hours last night, rolling through articles.

00:17:38
We'll get to that in a bit because I definitely want your takes.

00:17:40
So, so anyway — so you guys work with the counsel, and it is being

00:17:44
reported on simultaneously— —Well, we briefed Mueller way before

00:17:47
the public announcement, right? And in fact, where things get

00:17:50
complicated with the public announcement is that we had a

00:17:52
gag order from the Mueller team, right?

00:17:53
So, like, we go and tell them, and then they hit us with a gag order to

00:17:57
not talk about any of this. And so everything that Facebook

00:17:59
announced had to be negotiated with the special

00:18:01
counsel's office, which turns into this whole kind of

00:18:04
nightmare. When we do the announcement, it doesn't come

00:18:07
with any of the data, and that was specifically due to this

00:18:10
order from Mueller's office. So, you know — and I'm not a lawyer —

00:18:13
but, like, these lawyers were having some pretty honest and

00:18:17
big arguments, both with Mueller's team and then the lawyers in

00:18:21
Congress. And, like you said, Eric, the rollout is what

00:18:24
matters. And so, you know, Facebook ends up

00:18:26
putting out this mealy-mouthed blog post with my name on it

00:18:29
that had pretty much nothing to do with the first draft that I

00:18:31
wrote — like, every single word was changed by a lawyer.

00:18:34
—Comms comes first. As every reporter

00:18:36
would tell you, you can't let people put stuff under your byline.

00:18:39
—Yeah, you know— yes, I learned that too late.

00:18:42
Yeah, I should not have let them put my name on it.

00:18:44
—Really? You regret the whole thing?

00:18:46
Like, I mean, isn't something better than nothing?

00:18:49
I don't regret the white paper — I think getting the white paper

00:18:51
out. So on the white paper side I

00:18:54
wish it said Russia. This is the other part.

00:18:56
So when you talk about the treatment by the media, you

00:18:58
know, again, there's two totally different

00:18:59
campaigns. The IRA campaign? Absolutely

00:19:02
the responsibility of Facebook and Twitter, right?

00:19:04
It is our responsibility to catch that, and we did not. My

00:19:06
team did not catch that, I did not catch that, we didn't

00:19:08
stop it. The GRU campaign was targeted at

00:19:12
the media and that absolutely worked, right?

00:19:15
—I wrote— there's— there was a story

00:19:17
about John Podesta's pasta sauce recipe, which one could

00:19:19
argue probably wasn't really relevant to the election issues

00:19:22
at hand. And yet, there was— Politico ran

00:19:24
a live blog— —We all know, we all know that he

00:19:26
had risotto recipes, or whatever.

00:19:28
Yeah, I mean, yeah — like, look, they ran a live blog of, like, what

00:19:31
embarrassing stuff can we find in these emails?

00:19:33
And maybe it's a Russian plot, but it doesn't really matter

00:19:35
because it's newsworthy. This is where I spent most of my

00:19:37
evening yesterday reading through old articles. Because, as

00:19:40
Facebook gets closer to — I guess it was just delivering

00:19:44
testimony in front of Congress about what you found in your

00:19:47
research on the IRA — there are kind of leaks that come

00:19:50
out the day beforehand. And so, just pulling through a

00:19:53
sampling of articles: you know, this one from the New York Times,

00:19:56
actually, October 2nd, 2017 — headline: Facebook's Russia-Linked

00:20:00
Ads Came in Many Disguises — and, you know, it has a lede in there:

00:20:04
The Russians who posed as Americans on Facebook last year

00:20:07
tried on quite an array of disguises.

00:20:09
There's an article in CNN from Dylan Byers, September 28th,

00:20:13
2017 that refers to the IRA as a shadowy agency, which comes up

00:20:18
very frequently, describing them as a shadowy agency that's

00:20:21
extremely sophisticated in their ways of disseminating

00:20:25
their divisive bits of information, things of that

00:20:28
nature. There's also a column from

00:20:30
Margaret Sullivan, I want to get to later.

00:20:32
But again, from your perspective, as these articles

00:20:34
are coming out in advance of Facebook employees' testimony in

00:20:38
front of Congress. What's your sense as this is

00:20:40
being rolled out? Because it definitely fed a

00:20:42
frenzy happening in the public that

00:20:45
Russia fucked everything up. They came in there.

00:20:48
They broke American democracy, and they did it through divisive

00:20:52
Facebook posts. —Yeah, I mean, I think— so

00:20:54
Facebook obviously takes — should take — a huge amount of

00:20:57
blame on the way that all this information

00:20:59
was released and framed. And the desire to keep everything

00:21:03
secret, and then to only allow things to

00:21:04
come out in drips and drabs, really hurt the company.

00:21:07
That being said, the overall reaction seemed crazy because it

00:21:10
was completely out of touch with the actual quantitative size of

00:21:15
this content. Because something like 80,000 posts sounds big

00:21:19
until you realize the denominator on that is in the

00:21:22
hundreds of millions, right? And so the vast majority of this

00:21:25
content had almost no reach. Like, on the stuff where the Russians

00:21:28
posted the content themselves — all right, so IRA, they're

00:21:31
making propaganda, they're posting it.

00:21:32
So this is like a picture of people at a fence that is an

00:21:36
anti-immigration ad, or it's a pro—

00:21:39
something that looks like it's pro-Black Lives Matter

00:21:41
but made by a fake group, right? For that stuff, its overall

00:21:45
reach, compared to other content both organic and paid, was

00:21:49
nowhere near, right? There's hundreds of millions of

00:21:52
dollars that were spent on the election on Facebook, and they

00:21:54
spent a hundred and fifty thousand dollars, right? Which is, like, over the course

00:21:56
of two years — do the math, that's, what, less than five

00:21:59
thousand a month. I mean, like, there are— they are,

00:22:01
like, you know, D2C makeup brands for dogs that probably

00:22:05
spend more on Facebook and Instagram than the IRA did.

00:22:09
So the idea that this was a huge spend was— was always a

00:22:12
fallacy and I don't know how many people that, you know, got

00:22:16
whipped up into a frenzy about it really understood that.

00:22:19
Well, because it also got combined, completely without any

00:22:22
evidence, with the Cambridge Analytica issues, which is this

00:22:25
kind of imaginary world in which Cambridge analytical is this

00:22:28
magical Group of bond villains that have mind-control powers

00:22:32
based upon, right? Yo, people's likes that they

00:22:35
took yo, illegally via an API to be totally clear Cambridge

00:22:39
analytical themselves as a scam, right?

00:22:41
So they said they had these magical psychographic models of

00:22:43
people, and there are quantitative metrics as to

00:22:46
whether online ads work. There's a whole set of people

00:22:50
whose entire job it is to think about these

00:22:52
metrics, and on none of those metrics was Cambridge Analytica

00:22:54
any better than just any other kind of ad targeting

00:22:58
that people do on Facebook, right?

00:22:59
And they had nothing to do with the IRA, to be clear. Cambridge Analytica

00:23:02
had absolutely nothing to do with the Internet Research Agency.

00:23:04
The Internet Research Agency did zero data-upload

00:23:07
custom audiences. So they didn't even do what we

00:23:11
consider micro-targeting. They targeted, like, you know,

00:23:14
young men in this ZIP code in Baltimore for pro-Black Lives

00:23:19
Matter messages. But none of that mattered to the narrative,

00:23:22
which, conflating these things, was that these Russians have kind of

00:23:25
magical mind-control powers that they were able to use through a

00:23:28
hundred fifty thousand dollars of ads,

00:23:29
when the Hillary Clinton campaign spent like a hundred million

00:23:31
dollars, right? Like, the idea that they were a

00:23:34
thousand times... But what about this idea that

00:23:36
you guys were, like, hand-holding the Trump campaign in a way you

00:23:39
weren't the Clinton campaign? So there's, I think, totally

00:23:43
legitimate arguments on whether Facebook ads had an effect on

00:23:47
the campaign. Almost certainly Facebook ads

00:23:49
had an effect on the campaign, but those were the

00:23:51
quote-unquote legitimate ads, paid for by the campaigns and

00:23:55
the official committees and such, using the money that our country

00:23:58
allows people to give, right,

00:24:00
to candidates and the parties, for which there are no

00:24:04
legal controls. I had nothing to do with this.

00:24:05
So all of this I know from the investigation afterwards, right?

00:24:08
But my understanding is that both campaigns, just like any of

00:24:11
the big advertisers, if you're a Procter & Gamble, you

00:24:14
get a Facebook employee whose entire job it is to help your

00:24:16
campaigns run. And so these campaigns are

00:24:19
spending tens, hundreds of millions of dollars.

00:24:21
So they got offers that these people will help you do it, and

00:24:24
the Trump campaign, not knowing what they were doing, said yes.

00:24:27
And the Clinton campaign was like, no, we're good at this, and said no.

00:24:31
And so it is true, and it is quite possible, that Trump's

00:24:33
campaign was way better than Clinton's campaign because of

00:24:36
that. We partially don't have this data because the data is

00:24:39
kind of legally locked up, and it's never come out, and there's

00:24:42
never been any changes to the law to allow it to come out.

00:24:44
I actually think this is where Congress has failed

00:24:46
the most, is that there have been zero changes to online

00:24:49
advertising law since 2016. I would like to ban

00:24:52
micro-targeting for political ads, because I think it's

00:24:55
innately corrupting, because it both drives campaigns to collect all

00:24:59
this information, and it also drives them to target

00:25:03
different messages to different audiences.

00:25:05
Both political parties do this, and both political

00:25:09
parties think they're better at it. So they will make all these

00:25:11
noises publicly about how horrible the ads are, or whatever.

00:25:13
It's all BS, right?

00:25:13
In private, all of these senators and congressmen are

00:25:16
being told by their consultants, who are getting paid to run

00:25:19
political ads, they're being told by their

00:25:20
consultants, "We're better; do not unilaterally disarm," and so they

00:25:24
won't pass any laws. Europe,

00:25:25
Europe is starting to make some moves here, but in the United

00:25:27
States, there's been nothing.

00:25:28
Can we take a quick step back for a second to the IRA stuff?

00:25:32
Because, I mean, you sort of brought it up, and I have this

00:25:34
column here. This is, again, from that

00:25:36
same time period in 2017. This is from Margaret Sullivan, who

00:25:39
is a great media columnist. This is not a slam on her at

00:25:41
all, but I think it looks very interesting in retrospect and

00:25:44
really is a good summary of the era.

00:25:47
So the headline is "Collusion? Who needs it

00:25:52
when Facebook was allowing Russians to help Trump."

00:25:54
This is a key paragraph from this column: much of that

00:25:56
content, this is the IRA's content, was

00:25:59
expressly designed to widen the cultural divides in the United

00:26:02
States, to drive wedges among its citizens, and in doing so, to

00:26:04
help elect the Russian government's preferred candidate,

00:26:04
Donald Trump. It worked.

00:26:07
And while those more recent numbers are astonishing, and I'm

00:26:09
assuming this is the numbers on, like, reach, the reality is

00:26:11
probably far worse: research from Columbia University's Tow Center

00:26:15
suggests that Russian-linked information, or disinformation,

00:26:18
was shared hundreds of millions of times on Facebook.

00:26:20
The numbers boggle the mind. Boggle the mind of what, here?

00:26:22
No, let me, let me read the key paragraph here, because this is

00:26:24
where I think it really gets to the point.

00:26:26
We need to admit the obvious: if there had been no Facebook

00:26:29
spreading Russian propaganda, there might well be no

00:26:31
President Trump. Now, looking at this

00:26:35
now from 2021, and I don't want to seem too much smarter now,

00:26:38
because it's easy to be critical in hindsight.

00:26:40
But is that an insane conclusion to reach based on everything

00:26:44
that you saw from the IRA? And is the fact that a major

00:26:47
newspaper is publishing this, in some way, a failure on the part

00:26:51
of the media, or the interaction between the media and these tech

00:26:54
companies, to accurately assess the role that the IRA played in

00:26:58
the 2016 election? Yes, that's wrong. From my

00:27:01
perspective, the two kinds of online propaganda that were most

00:27:04
effective were, one, legitimate Facebook ads,

00:27:07
the ads that were run by the candidates, in which Trump did a

00:27:10
better job than Clinton did.

00:27:11
Straight up, above board, using custom audiences and all the

00:27:14
Procter & Gamble shit, which, again, I would like laws

00:27:17
to change, but we need laws to change it, right? Right.

00:27:19
And then the second was the GRU activity, because the GRU

00:27:26
activity completely changed the tenor of the coverage of Hillary

00:27:29
Clinton. That was incredibly effective, because what they used

00:27:32
was stolen emails that allowed them to

00:27:35
create stories that were based upon a kernel of truth

00:27:38
where the story itself was not accurate. And to be frank,

00:27:38
the US media has never, ever, ever looked inside itself at the

00:27:43
fact that every major news organization, you know, The New York Times, The

00:27:45
Washington Post, CNN, MSNBC, not just Fox, which, like, we'll just

00:27:48
have to write off Fox, like, what can we do about Fox,

00:27:52
but for all of what I consider the legitimate media,

00:27:54
every single one of them was played by the GRU. They wrote

00:27:57
exactly the stories the GRU wanted, and they have done

00:28:00
no soul-searching on that at all.

00:28:03
And so, yes, we screwed up. I will absolutely admit

00:28:06
that we screwed up, but the idea that a hundred thousand dollars

00:28:09
in ads is as important as the entire legitimate American

00:28:13
media changing the way it covered Hillary Clinton is just

00:28:17
ridiculous. I want to just say, Tom, a

00:28:21
response to your comment, and then a question for Alex in a

00:28:23
minute. That Margaret Sullivan column is

00:28:25
so interesting because it kind of erases history. You know,

00:28:29
Russia had used propaganda to widen the divisions between

00:28:34
Americans before. They were very active in the '60s and '70s in the

00:28:37
United States, when the country was going through an

00:28:39
extraordinary civil rights movement.

00:28:42
This is not the first time that Russia, and other countries, but

00:28:45
especially Russia, have used internal divisions in a country

00:28:48
to try to destabilize the country itself.

00:28:50
So I think that what is missed in that column is that this is

00:28:53
not a new strategy, it's simply a new tool, and I think we could

00:28:57
debate whether or not the tool is more

00:28:59
effective than any other tool they've had before, and that is

00:29:03
an interesting debate, but certainly this is not new.

00:29:05
I mean, if you're a student of history and you've read anything

00:29:08
about Russia's efforts to cause the collapse of the United

00:29:13
States in the '60s, you know this isn't new.

00:29:16
Yeah. Thomas Rid's book on this is

00:29:18
fantastic, right? Active Measures. Like, absolutely,

00:29:22
the whole "the CIA invented AIDS" thing.

00:29:24
Yes, and I think this is good, because we've now promoted

00:29:26
Cecilia and Sheera's book and soon-to-be TV show,

00:29:29
and now this really great book,

00:29:30
Active Measures. We're really more of kind of like a marketing, a

00:29:34
marketing agency on here. Do you guys have a code that

00:29:37
gets us a discount on these books? You need to, you need to line

00:29:40
that up before this goes out. Yeah, but I think that on the

00:29:42
question of whether or not newsrooms have really reckoned

00:29:46
with this idea of being played by the GRU, there were moments

00:29:50
where at least Dean Baquet at the Times, he came out at different

00:29:55
panels. He was asked about this and he

00:29:56
did say that it had led to some soul-searching.

00:29:59
You know, internally, certainly, this conversation

00:30:03
has been had and is being had, and those are really important and

00:30:07
ongoing conversations, especially as we saw going into the

00:30:10
2020 election, and we'll see again going into the 2022

00:30:15
midterms. Like, this is an ongoing conversation,

00:30:18
even if it's not necessarily happening as publicly as it is

00:30:21
for Facebook, and whether or not that's fair, I think, is

00:30:24
a very good question. Really? I mean, there are

00:30:28
great journalists at all of these organizations who I've had

00:30:30
private conversations with, where they absolutely recognize all

00:30:33
this. And I know that they are part of

00:30:35
the internal conversations of, like, let's not get played again.

00:30:37
On Dean Baquet, I remember very distinctly a week, because I

00:30:40
wrote something about this, where Mark Zuckerberg gave his,

00:30:43
like, the speech where, like, he changes history,

00:30:45
that Facebook was invented because of the Iraq War, which

00:30:47
is, yeah, yeah... I was not in that Harvard group,

00:30:49
which is why I don't own an island, but, like, that does not

00:30:52
seem accurate to me. But the same week, Dean Baquet was on

00:30:56
Michael Barbaro's podcast, which I'm a huge fan of, the Times podcast.

00:30:59
And Barbaro, to his credit, is, like, interviewing his boss's boss's

00:31:04
boss and asking him these really tough questions.

00:31:06
And Baquet says, if it's newsworthy, we will publish it,

00:31:10
right? And so I saw both Zuck and Baquet

00:31:12
as kind of, actually, to me, similar figures, in that for

00:31:16
Baquet everything is about newsworthiness, right?

00:31:19
And even if you're getting played, even if the leaked

00:31:22
documents were leaked from hacking, from a Russian hacking

00:31:25
group, they're going to run it. Whereas Zuck was all about

00:31:28
kind of free speech and the freedom of individuals,

00:31:30
and I saw them as, like, in this parallel, that both of them are

00:31:32
getting played, both of them have, like, this deep belief in

00:31:35
what they're doing, and can't understand how that

00:31:37
deep belief is being used against them. So at Facebook,

00:31:39
like, we at least published it. Yes, I failed to get the company

00:31:43
to put Russia into that paper; The New York Times wrote nothing,

00:31:47
right? The New York Times has never, ever

00:31:50
written anything about "we got played by the

00:31:51
GRU, and we're sorry." And so, yes, did I fail?

00:31:54
Yes, but I at least will say it.

00:31:56
We need to realize we need to rely on media columnists for

00:31:59
that. We need to rely on Margaret

00:32:00
Sullivan for those sorts of articles.

00:32:03
But Margaret Sullivan is as subject to the, you know, the

00:32:06
whims of the moment as anyone. And that means, like, in general,

00:32:09
right? Sure.

00:32:09
It's really hard to get the media to write a story saying,

00:32:12
we messed up. And I think there have been many

00:32:14
moments where various people have said, don't you think that

00:32:18
BuzzFeed should apologize for publishing raw, unvetted

00:32:23
intelligence that completely changed the course of how we

00:32:25
think about Donald Trump, now that it's falling apart?

00:32:27
These questions have come up again

00:32:28
and again at multiple media organizations, and I'm not

00:32:31
excusing them. I'm just going to agree with you

00:32:33
that we haven't seen the equivalent of a white paper

00:32:35
where a media organization comes out

00:32:37
and says, this is how we messed up, and this is what we

00:32:40
want to do better. So I think, because of the way

00:32:42
the media works, we are all really dependent on columnists,

00:32:45
whether it's a Margaret Sullivan or whether it's the Columbia

00:32:47
Journalism Review, some sort of outside forcing function, to say

00:32:52
this is important to recognize.

00:32:55
For sure, for sure. I mean, the handling of, like, the

00:33:00
Hunter Biden emails, Joe Biden's daughter's diary,

00:33:05
I mean, those would seem to be early signs that the media is

00:33:09
taking a much more command-and-control approach, which is also

00:33:10
criticized. Well, I mean, it is sort of a

00:33:10
no-win case, where if there's information out there that seems...

00:33:18
It's absolutely no-win, and it's no-win for both the tech

00:33:20
companies and the media. I mean, there's no doubt

00:33:23
both sides screwed up, right? And sort of the government.

00:33:26
So, like, you're asking, what did it

00:33:27
feel like for us inside Facebook?

00:33:29
What it felt like was, there are

00:33:29
three people: there's Mark Zuckerberg in the hoodie,

00:33:36
there's, you know, the FBI G-man in his suit, and there's, you

00:33:38
know, a New York Times reporter with, like, the press hat on, like

00:33:41
the little, you know, fedora or whatever, and both the

00:33:44
government guy and the press guy are like, oh man, that guy in the

00:33:47
hoodie screwed up, right? We did nothing wrong, but they

00:33:47
really messed up. All these things that went bad at

00:33:49
Facebook are accurate and true, and it is the mistakes of the

00:33:52
company that need to be fixed, but they are in the context of

00:33:55
an overall failure of American society to deal with this new kind of

00:34:00
attack by one of our adversaries.

00:34:02
And unless you look at the big picture, you can't solve it.

00:34:05
And that's, like, one of my lessons coming out of the 2020

00:34:07
election, that the context of propaganda

00:34:10
online is completely changed and the vast majority of it

00:34:12
now comes from verified American voices.

00:34:14
We know exactly who they are.

00:34:16
They're not being amplified by Russians, they're not fake accounts,

00:34:22
and they have the ability to control the media context as

00:34:24
well as to get their messages out on social media.

00:34:27
And until we kind of deal with the fact that this is a

00:34:31
multimedia issue, that it flows between kind of the

00:34:34
traditional media and the online platforms,

00:34:35
we're not really going to be able to do anything.

00:34:35
I don't think any of us would make this case,

00:34:40
but some could say that it was in the media's interest to

00:34:44
put the blame on Facebook, to shift the blame from the media.

00:34:48
If the bigger hack was, in retrospect, about using the media

00:34:52
to spread disinformation, pointing the finger at Facebook

00:34:56
was in the media's, like, financial, corporate interest.

00:34:56
I mean, obviously I disagree with that.

00:34:59
Really, really, the only reason I disagree with that is because

00:35:02
you're talking about a period when the media was so beloved

00:35:06
that nobody was shifting blame. Well, beloved by

00:35:08
Democrats. So loved by the left that if you

00:35:11
were walking down the street as a Washington reporter, you went

00:35:16
to an airport and somebody recognized you, in Washington, not

00:35:20
just in Washington, like, all over the country.

00:35:22
People were thanking you, you were getting letters.

00:35:24
I'm talking about the period

00:35:26
right after Trump was elected. Oh, I know what you're talking

00:35:28
about. Look, my mom

00:35:29
signed up for a subscription to The Washington Post just to

00:35:31
support them, right?

00:35:32
No, no, I'm talking about the Trump, the

00:35:35
whole Trump term of his presidency. I mean, people were wearing

00:35:38
"democracy dies in darkness" shirts in DC.

00:35:40
Yeah, there was no blame to shift. Who was blaming the media for

00:35:45
Trump at that point? Now, people are... now might be the

00:35:48
time when that thesis would make sense. But literally in the days and

00:35:52
months after the election, basically from the time Donald

00:35:57
Trump was elected through the issuance of the Mueller report,

00:36:00
nobody was blaming the media for anything, just like

00:36:03
nobody was blaming the FBI, another group that is hated but

00:36:06
suddenly was not hated for, like, a couple of years.

00:36:09
I just want to ask what the social-psychological story for

00:36:11
the media is, if it's not that sort of narrative, like, this is

00:36:15
a good way to deflect. Like, what is your mental model?

00:36:18
Is it just, like, the media is not top-down enough and there isn't

00:36:21
enough, like, command and control, that it's sort of an

00:36:24
organism? Or, just, like, do you have a thesis on where the media

00:36:28
fails on this stuff?

00:36:30
Well, I think Ben Smith was right about resistance

00:36:32
journalism, right? Which is like we end up in a

00:36:34
situation where as long as you publish something that fits

00:36:38
preconceived notions about who the bad guy is, like, if you can

00:36:40
tell something in the arc of "this is the bad guy

00:36:44
and they did bad things," then getting the details right

00:36:47
isn't that important, right? Like, there's no pushback, and,

00:36:50
and we see that with Facebook all the time. We're seeing that

00:36:51
now with the Haugen documents, where the media reports

00:36:54
about those documents do not match up to the documents

00:36:57
themselves once they are released, which is

00:36:58
one of the reasons I am quite upset that this is

00:37:01
the way these documents were released,

00:37:02
like, in a way to create kind of a feeding frenzy only for the

00:37:07
headlines. There's a bunch of stuff in

00:37:09
there that's going to be super

00:37:10
important for academic study of social media.

00:37:13
In the long run, this is going to be great,

00:37:14
because this is going to kick off kind of a much more

00:37:17
rigorous, empirical study of the things

00:37:19
that happen online, and that is going to see a lot of great

00:37:21
work. But in the meantime, we have to

00:37:23
deal with the headline-grabbing of, like, The Wall Street Journal

00:37:26
writing the story about the Instagram slides that did not

00:37:29
match the slides once the slides

00:37:30
came out, or The Washington Post, which had a headline about Facebook

00:37:33
up-rating, you know, the angry emoji, which was

00:37:37
intentionally misleading, in that, in the story,

00:37:39
if you read it, all of the emojis were up-ranked; all the

00:37:43
reactions other than Like were up-ranked, and then, and then

00:37:45
there was an internal study, and that turned out to be causing

00:37:47
harm, so they fixed it, which is

00:37:49
exactly what you want. People who read The

00:37:51
Washington Post story were less educated about how Facebook

00:37:54
handles these issues than before they read the story.

00:37:57
And so that is resistance journalism to me, because nobody

00:37:59
pushed back on them. The Wall Street Journal,

00:38:01
the Instagram one, has been, I mean, we've talked about it a lot.

00:38:03
What's your case? Can you just quickly articulate

00:38:06
your objection, though? So there's a couple of things.

00:38:08
One, this is, like, very initial work by an Instagram team to try

00:38:12
to ask people how they feel in different circumstances, to then

00:38:15
lead to more research, and you're talking about a very small

00:38:17
sample size. And this kind of just asking

00:38:20
people how they feel is not that great a method, and that's known.

00:38:23
And then the second is, there's parts where it's much more

00:38:25
positive, and it's great, makes people really happy, right?

00:38:28
When you read this, you know, okay,

00:38:30
it turns out that, yes, there are problems with teenagers and

00:38:32
social media, and that is something that needs to be

00:38:34
studied, and this is the beginning of

00:38:36
that. The best part of this was, like,

00:38:38
there was an article in Wired magazine, which is owned by

00:38:41
Condé Nast, who are the people who publish Vogue, which, like,

00:38:44
to get lectured by Condé Nast on, you know, beauty standards

00:38:48
for teenage girls is amazing and just kind of encapsulates

00:38:52
the overall problem here, which is, there is a problem both in

00:38:54
social media and media in a bunch of these areas, but only

00:38:57
one of those sides ever gets looked at; the media never

00:38:58
looks at itself. Condé Nast will never ask

00:39:00
itself, have generations of our publishing hurt people? It really

00:39:04
obviously doesn't work that way. A reporter at Wired isn't even

00:39:07
connected to what the editor-in-chief of Wired thinks,

00:39:09
let alone with somebody at Vogue. But nobody writes it.

00:39:12
Nobody will write any of these things in the bigger context.

00:39:14
No, but I understand the argument, in part because the

00:39:16
media companies themselves aren't doing those studies, too.

00:39:19
So we're in an unusual situation where, probably for

00:39:21
the first time, we have studies about how media

00:39:24
products, if we want to call, you know,

00:39:26
Instagram a media product, if we want to call a copy of Vogue

00:39:30
a media product, how they're actually making people feel. You

00:39:32
know, Vogue does reader surveys, all publications do reader

00:39:36
research, but it's really mostly about subscriptions:

00:39:39
how did this convert a subscription, or did this not

00:39:42
convert a subscription? What do our most loyal readers

00:39:44
do? And how do we get more of them?

00:39:47
I think, to your point about why the Facebook research will

00:39:50
continue to be important beyond the stories, it's that we've

00:39:53
never seen really this much research done on media products

00:39:57
before, and I think it would be great if

00:39:59
more was done. I think it'd be great if the

00:40:02
media itself did more of that kind of research.

00:40:04
But right now, it does zero of it. So, right.

00:40:07
Well, which is why the big irony is, this is coming out of The

00:40:09
Wall Street Journal, a Murdoch property, and the idea of Rupert

00:40:11
Murdoch having, like, a civic integrity team looking at his

00:40:14
impact on democracy is kind of hilarious.

00:40:16
I sort of realized that, you know, when Ben wrote that column.

00:40:19
And, by the way, it must be said that

00:40:20
if you want to talk about the various purveyors of resistance

00:40:23
journalism, the release of the Steele dossier could easily fall

00:40:26
within that category. That was Ben's decision.

00:40:29
It's amazing. Yeah, it was, and look, he can

00:40:31
defend himself, and I'm sure he will, and he has, several times.

00:40:34
So he has, yeah, there's no reason to call him out.

00:40:36
He's a friend of the show, but it absolutely was. Friend of mine,

00:40:40
friend of the pod, he was a great guest, and we like Ben, he's a

00:40:44
great guy, we love him. But I'd say, Eric, you are

00:40:49
more like Ben than any of the other people on this show.

00:40:52
And this is why... you can see all of this on my side

00:40:56
channel, my Discord thread.

00:40:57
I wrote, like, a multi-part screed defending Ben Smith

00:41:01
owning shares. I've written on Twitter.

00:41:03
I'm, I'm the most Ben Smith toady around.

00:41:05
I agree with Ben Smith on the Steele dossier.

00:41:08
The only reason, the only reason you're ambivalent is

00:41:10
because you guys are the same person.

00:41:12
I'm just saying. Yeah.

00:41:13
They if you playing the banker is more fun.

00:41:15
If he's a friend of mine, then he's just a blind friend.

00:41:18
Yeah. Weirdly, Eric was not there the

00:41:19
week we interviewed Ben. So you make your own decisions

00:41:21
on what that means. I know, I missed out.

00:41:23
I've talked to Ben many, many times. Oh, I just mean you're the same

00:41:25
person, but that's not going to make a Substack

00:41:29
joke. Aren't you supposed to make Substack jokes, like,

00:41:32
every 20 minutes or something? And that required traditional

00:41:35
journalists. But no, in my mind, everything has

00:41:38
become resistance journalism. Not everything,

00:41:40
a lot of things have become resistance journalism now, because

00:41:44
I think people recognize that, as subscriptions have come, you

00:41:48
know, to be the predominant business model of

00:41:51
many of these news organizations, there is a real

00:41:53
need to serve that audience. Do you think that that has

00:41:56
gotten to be more and more the case, you know, post-2016,

00:42:00
post-2020? And if you agree with that, Alex,

00:42:03
what role do you think the tech companies, and Facebook

00:42:06
specifically, play in that being the way things are?

00:42:10
I mean, yeah. So the dynamics of, like, the

00:42:13
change away from the advertising model towards subscriptions, that

00:42:16
is obviously being driven by the tech companies taking up so much

00:42:19
revenue. And I've said this probably before: I think a

00:42:21
huge failure by the tech companies was never to figure

00:42:24
out a way to revenue-share with legitimate media, whatever you

00:42:27
call legitimate media, right? Like, a bunch of that stuff has

00:42:29
happened, and it happened kind of too late,

00:42:31
after media was hollowed out, especially local media.

00:42:34
I saw some stats of, like, 10% of people who work in journalism

00:42:37
work for The New York Times now. It's like four percent of

00:42:39
the actual reporters and 10% of people overall.

00:42:43
I actually wonder, like, how much support staff The New York Times

00:42:45
has. I'm skeptical of that, I would need to see it. You insulted Dean

00:42:49
Baquet, and now you're talking about the Times...

00:42:52
This is, like, the game "get Katie fired." Well, that would get rid

00:42:55
of one employee at the Times. I mean, that, like, consolidation is

00:42:59
clearly both from advertising being taken up by the tech companies

00:43:03
as well as the decisions of some journalistic outlets,

00:43:06
plus just the success of the Times in building up an audience

00:43:09
that is willing to pay all this money but is a smaller and smaller

00:43:12
audience, right? So that's what Substacks

00:43:14
demonstrate. I mean, the New York Times thing

00:43:15
is the same as the Substack demonstration, which is, if you

00:43:17
have a small number of people who love you, then that is way

00:43:20
more profitable than, you know, trying to make it on a

00:43:24
per-click, per-ad basis, right? But our subscription base is

00:43:27
growing. It's not a small number of

00:43:29
people who are subscribing. So it's hardcore,

00:43:32
dedicated... you mean compared to, like, the entire universe of

00:43:34
people who use Facebook, or TV? What I'm saying is, like,

00:43:38
there's a small number of people who really love you, and all of these

00:43:42
media outlets that moved to subscription are absolutely

00:43:44
motivated to keep the small number of subscribers they have.

00:43:46
Okay, I think we have more than eight

00:43:47
million subscribers. I don't think it's, like, the

00:43:49
tiniest number. And I was just trying to be clear,

00:43:52
I get the Times, I'm, you know... but, like, I just have to say, we're

00:43:55
all liberal elites, coastal elites, right?

00:43:58
And don't you trust the Times the most? This is what I want to ask:

00:44:00
what do you trust, like, what, if not the Times?

00:44:03
Well, you know, like, oh, I do trust the Times most of the time,

00:44:07
except, like, I'll read the story...

00:44:07
I mean, again, there's a difference between like things

00:44:10
being factually accurate or not and things being applied in the

00:44:13
right context or the amount of coverage they get.

00:44:14
Right? Yeah, I understand.

00:44:16
I mean, like, this idea of resistance journalism, I'm

00:44:19
opposed to it just in practice, just, that's just me.

00:44:23
I mean, if you look at most of the things that came out of the

00:44:25
Frances Haugen stuff, you can agree or disagree with the

00:44:27
conclusions, and whether they, you know, illustrate a

00:44:31
completely fucked up company. Or the question is, like,

00:44:34
should they just have released that information publicly?

00:44:36
And we could have seen, oh, there was research internally

00:44:39
at Facebook showing that there's a percentage of young girls that

00:44:42
use Instagram that are unhappy. Obviously, you don't want to put

00:44:45
out the internal deliberations of The New York Times, but some

00:44:48
sort of public discussion of a decision as to why they chose

00:44:52
not to run certain things would sort of benefit people, to

00:44:56
understand what the process is that, you know, separates a

00:45:00
mainstream news organization from one that has no

00:45:03
credibility. I mean, on the Facebook side,

00:45:05
I think, absolutely, they should publish it.

00:45:07
We can go two directions from this. One, either

00:45:09
tech companies start to publish these things, because they

00:45:12
realize that doing it themselves

00:45:14
makes it not a scandal. Like, a bunch of these reports are just

00:45:16
reports, in that there are good things and bad things. Like, the

00:45:18
Instagram one honestly is both sides, and if

00:45:21
Facebook published that, it would not be seen as, like, this

00:45:23
huge scandal for which you need all these investigations.

00:45:25
We actually launched a journal at Stanford, which was totally

00:45:27
coincidental with the Haugen stuff, but it's called the

00:45:29
Journal of Online Trust and Safety, specifically

00:45:31
because we want to get platforms to publish peer-reviewed

00:45:34
research in places where they can interact with academics and

00:45:37
others. So I think that's a good thing.

00:45:39
There's also the possibility from here that companies just

00:45:41
don't do the research because the truth is that Facebook has

00:45:43
more people working on this than the rest of the industry

00:45:45
combined and is now being punished for that fact.

00:45:47
And so there's a possibility that if you're at TikTok,

00:45:50
you're like, oh no, never look, never create a document that

00:45:53
says anything is bad because eventually if it leaks it

00:45:56
becomes a scandal, and it's better not to look.

00:45:59
Also, to go back to, like, the 2016 story:

00:46:01
Facebook is not the largest advertising company on the internet, it

00:46:04
turns out, not even by a long shot, and there's another large

00:46:08
company, let's call it Smoogle, that had lots

00:46:11
of stuff going on that it quietly found and never told the public,

00:46:15
never told Mueller's team, never told anybody, because they're not

00:46:17
legally required to. Do you think

00:46:19
Google was worse than Facebook on Russian interference in the

00:46:21
2016 election? I don't know, because we

00:46:25
don't know anything. And since then, their playbook has

00:46:30
been to not share anything, right?

00:46:33
So, like, Facebook has CrowdTangle, which might get killed.

00:46:35
CrowdTangle is a fantastic platform.

00:46:39
That is super useful and was incredibly key to our 2020 work

00:46:43
and all the signs look like Facebook's going to kill it.

00:46:45
But as of right now, Facebook has incredible transparency.

00:46:48
Twitter has incredible transparency.

00:46:50
YouTube is almost impossible to study as an outside researcher.

00:46:54
TikTok is almost impossible to study as an outside researcher,

00:46:56
and so I think we're going to need to have mandated transparency

00:46:59
here. Because what's happened so far

00:47:01
is that the companies that have opened up

00:47:02
have had a huge amount of criticism, and there's never any

00:47:04
criticism of the companies that have decided not to open up.

00:47:06
And as long as that dynamic continues to exist, I think

00:47:09
we're going to have this lopsided coverage but also the

00:47:11
inability to fix some of these problems.

00:47:13
So guys, I'm sorry, I have to call a time-out.

00:47:15
My daughter's school is about to call, and there'll be trouble.

00:47:16
So we've got about 15 minutes. I can continue this conversation for

00:47:18
15 minutes, or we've got to wrap it up.

00:47:20
So I think we've gone long. Sure, I'll give it like a little

00:47:23
goodbye. See ya.

00:47:25
Yes thanks so much for having me.

00:47:26
All right. We're like, what's a concluding

00:47:30
point? Well, Alex, I think we've all

00:47:32
agreed that Facebook and the media companies are all the same

00:47:36
that we all need to get in the same ship together and

00:47:39
figure it all out. They're not all the same.

00:47:41
I think, right, my conclusion would be we have

00:47:45
fundamental weaknesses in our media environment as well as

00:47:48
like, the psychology of Americans, and we have to look at the big

00:47:51
picture if we're going to understand or deal with any of

00:47:52
these problems. Thank you so much for having me.

00:47:54
I'll send you guys the recording.

00:47:55
Sounds good, thank you. Thank you so much Alex.

00:47:57
We really appreciate it. Goodbye,

00:48:09
goodbye. Goodbye, goodbye, goodbye,

00:48:12
goodbye. Goodbye.