Elon Musk is the liberal elite’s enemy of the moment.
How quickly the bad blood for Mark Zuckerberg is forgotten.
When Zuckerberg’s Meta released Twitter rival Threads, reporters and left-leaning types (myself included) flocked to the new app as a potential refuge from Musk’s Twitter.
The enemy of my enemy is my friend seemed to be the logic of the moment.
I invited Facebook whistleblower Frances Haugen onto the podcast to discuss the sudden embrace of Threads, her ongoing criticisms of how Facebook operates, and her new book, The Power of One.
Haugen, for one, has not forgotten the problems with Facebook. She hadn’t downloaded Threads.
I said on the podcast, “As a reporter, it’s funny to see the reporter class embracing Threads at the moment when two years ago, or even more than that, they would have been so negative and apprehensive about trusting Facebook. I’m just curious watching the pretty upbeat response to Threads, what do you take from that and are you surprised there seems to be some media trust for Facebook right now.”
Haugen was empathetic toward people fleeing Twitter for Threads.
“I think it’s one of these things where the trauma the Twitter community has faced in the last year is pretty intense,” Haugen told me. “People really liked having a space to discuss ideas, to discuss issues, and the idea that they could have a space again feels really good.”
We spent much of the episode getting into the particulars of The Facebook Files and her criticisms of Facebook.
She outlines a core critique in The Power of One’s introduction:
One of the questions I was often asked after I went public was, “Why are there so few whistleblowers at other technology companies, like, say, Apple?” My answer: Apple lacks the incentive or the ability to lie to the public about the most meaningful dimensions of their business. For physical products like an Apple phone or laptop, anyone can examine the physical inputs (like metals or other natural resources) and ask where they came from and the conditions of their mining, or monitor the physical products and pollution generated to understand societal harms the company is externalizing. Scientists can place sensors outside an Apple factory and monitor the pollutants that may vent into the sky or flow into rivers and oceans. People can and do take apart Apple products within hours of their release and publish YouTube videos confirming the benchmarks Apple has promoted, or verify that the parts Apple claims are in there, are in fact there. Apple knows that if they lie to the public, they will be caught, and quickly.
Facebook, on the other hand, provided a social network that presented a different product to every user in the world. We— and by we, I mean parents, children, voters, legislators, businesses, consumers, terrorists, sex- traffickers, everyone— were limited by our own individual experiences in trying to assess What is Facebook, exactly? We had no way to tell how representative, how widespread or not, the user experience and harms each of us encountered was. As a result, it didn’t matter if activists came forward and reported Facebook was enabling child exploitation, terrorist recruiting, a neo-Nazi movement, and ethnic violence designed and executed to be broadcast on social media, or unleashing algorithms that created eating disorders or motivated suicides. Facebook would just deflect with versions of the same talking point: “What you are seeing is anecdotal, an anomaly. The problem you found is not representative of what Facebook is.”
To jog your memory for the episode, in September 2021, the Wall Street Journal published the first in a series of articles, called the Facebook Files, about the company’s cross check program, which gave special treatment to high-profile users when it came to the company’s moderation decisions.
The Journal followed that report with a story about how Facebook’s internal research showed that 32% of teen girls said “that when they felt bad about their bodies, Instagram made them feel worse.”
The third story in the series showed that Facebook’s decision to preference “meaningful social interactions” seemed to have the opposite effect, giving more reach to posts that instigated conflict and anger.
Perhaps most damning, in my mind, was the Journal’s fourth story in the series, which showed that Facebook had failed to implement internationally many of the table-stakes moderation practices it applies in the U.S.
The Journal won a Polk Award for its reporting.
I have at times been skeptical of how damning these stories were.
It’s not that crazy to me that Facebook would want to provide extra attention toward moderation decisions for public figures.
Is Instagram harming teen girls more than Vogue or Cosmo?
So it was fun to finally hash out some of these issues with Haugen on the podcast.
Ultimately, I think we were mostly aligned that we both support much better disclosure requirements for Facebook. Regulators are fighting with both arms tied behind their backs.
I was disappointed, however, that Haugen seemed to bend over backward to come off as apolitical in her critique of Facebook. She didn’t really engage with the obvious political asymmetry: Republicans are clearly much more likely to post the type of content that Democrats would call misinformation.
I think that’s a fair statement whatever you think of “misinformation.”
Anyway, that should give you enough context to dig into our conversation. Enjoy!
Give it a listen
Highlighted Excerpts
The transcript has been edited for clarity.
Eric: How would you see a disclosure regime working that still allows companies like Facebook to be flexible and to change?
Frances: I think a lot of people don’t sit and think about what’s the menu of options when it comes to intervening in a problem as complicated as this. I’m really glad that you brought up the idea that these companies grow and change, where the next one to come along might not fit the exact same mold of this one. One of the ways the European Union handles that flexibility — and to be really clear, this kind of way of doing regulation of saying disclosure and transparency is instead of something like what’s happening in Utah, where Utah is coming in and saying, “This is how you will run your company.” If people are under 18, they have to have parent supervision, no privacy for kids, their parents can see everything — or like Montana coming out and just flat out banning TikTok. Those are kind of “building fences” type rules, where we’re like, “Oh, this is the fence you can’t cross.” And the thing about technology is it moves and changes, and they’re very good at running around fences.
So the alternative is something like what the European Union passed last year, which is called the Digital Services Act. And the Digital Services Act says, “Hey, if the core problem is a power imbalance, right, the fact that you can know what’s going on and I can’t, let’s address that core problem because a lot of other things will flow downstream from it.” So they say, “Hey, if you know there’s a risk with your product, you need to tell us about it. If you discover one, or if you can imagine one, you need to tell us about it. You need to tell us your plan for mitigating it because it’s going to be different for every platform. We want to unleash innovation. And you need to give us enough data that we can see that there was progress being made to meet that goal. And if we ask you a question, we deserve to get an answer,” which sounds really basic, but it’s not true today.
Eric: Some of these problems that you've identified are just human problems. If you talk about sort of the Instagram critique with it, potentially making sort of young teenage women — some segment of them unhappy. I mean, you could say, like, was that so different from Vogue? Is this really an algorithmic problem?
Haugen: There have always been teen girls that were unhappy about their bodies or how nice their clothes were. But there are a limited number of pages of Vogue every month. The second time you read Vogue, it’s going to have a different impact on you than the third time you read Vogue. Or you’re going to get bored of it. And in the case of something like Instagram, Instagram progressively pushes you towards more and more extreme content.
…
With a 13-year-old girl, she might start out by looking for something like healthy recipes and, just by clicking on that content, get pushed over time toward more and more extreme material.
Eric: Why did you decide to come out and reveal your identity?
Frances: I had been contemplating for quite a while whether I would have to come forward at some point. I had a chance to talk to my parents about it a large number of times, just because what I was seeing on a day-to-day basis while I lived with them during COVID was so different from what Facebook’s public narrative was on these issues. But the moment where I was like “okay, I have no other options” was right after the 2020 election — so this was in December, less than 30 days after the election — when they pulled us all together on Zoom and said, “You know how for the last four years, the only part of Facebook that was growing was the Civic Integrity team?” That was the team at Facebook aimed at ensuring Facebook was a positive social force in the world, that it wasn’t going to disrupt elections, that it wasn’t going to cause any more genocides, because by that point there had been two. They said, “Hey, you are so important; we’re going to dissolve your team and integrate it into the rest of Facebook.”
And when they did that, that was kind of the moment where I realized Facebook had given up. That the only way that Facebook was going to save itself was if the public got involved. That the public had to come and save Facebook.
00:00:01
Hey, it's Eric Newcomer. Welcome to the Newcomer Podcast.
00:00:04
A really exciting episode this week.
00:00:07
We're talking to a whistleblower, Frances Haugen,
00:00:10
the Facebook whistleblower is on the show.
00:00:12
I feel like we've been building to this for literally years.
00:00:15
I've been talking about Facebook's content moderation,
00:00:18
the implications of her release to the Wall Street Journal of
00:00:21
the Facebook Files, and her decision to go public.
00:00:24
And we really got into the substance of it.
00:00:26
I feel like so much of the discussion around social media
00:00:29
moderation is sort of built for simplistic public consumption,
00:00:34
and I really tried to get into the meat of these issues and
00:00:37
what potential policy solutions there are.
00:00:42
So give it a listen. Frances Haugen, Facebook
00:00:45
whistleblower, welcome to the Newcomer Podcast. Author of The
00:00:48
Power of One. Thanks for joining me.
00:00:51
Thank you for inviting me. Happy to be here.
00:00:53
I have followed your story, the Wall Street Journal reporting
00:00:56
now the book super closely. So really excited to have the
00:01:00
conversation. You know, the Facebook Files
00:01:04
series and sort of the complaints about Facebook can be
00:01:08
so sprawling, like there's so much to it.
00:01:12
So I just wanted to start off the conversation.
00:01:14
If you had to sum up the top Facebook sin or you're like main
00:01:19
core objection, what is your top
00:01:22
Facebook objection? So I would say the through line
00:01:25
that goes through all of the different kind of shocking, you
00:01:28
know, headline, exclamation mark type things that came out of the
00:01:31
Facebook files. And to remind people who maybe
00:01:34
don't remember exactly what happened about two years ago,
00:01:36
you know, it ranged everything from: Facebook
00:01:38
knew there were human traffickers on the platform and
00:01:41
worried more about offending the buyers than the people being
00:01:45
sold. Or Facebook had lots of research
00:01:48
on their products making kids unhappy, making them depressed,
00:01:51
facilitating eating disorders, and refused to hand it over to the
00:01:54
Senate when asked. You know, on and on and on.
00:01:57
But the thing that was the common through line, I would say
00:02:00
through all those things is we live in a world where Facebook
00:02:04
has to report publicly its profit and loss numbers, its
00:02:08
expenses like how it got to that profit and loss, but it doesn't
00:02:12
have to report a social balance sheet, you know, like what's the
00:02:16
social costs of making those dollars. And for most other
00:02:19
industries, those public costs, those
00:02:22
externalities are very obvious. You know, it's pollution in a
00:02:25
river, it's pollution in the air, it's forced labor.
00:02:28
You know, we can see what's going on. With digital products,
00:02:33
you can take from the social side of the balance sheet and
00:02:36
make your economic balance sheet look a lot better.
00:02:39
And that's the core problem that is repeated over and over again
00:02:42
across all of these, all these harms.
00:02:44
Yeah. And I mean, that really sort of
00:02:46
comes through in the book. I mean, you make the point that
00:02:48
like, you know, Apple, it would be sort of visible with a
00:02:51
hardware device, whereas with this social media company like
00:02:54
Facebook, it's less visible. Is it a matter of the fact that
00:02:59
Facebook is customizing feeds from person to person or is it
00:03:04
more than that, what makes Facebook's products so much,
00:03:07
like, less visible than something more tangible?
00:03:11
The point I make in the book is if you go search for Apple
00:03:14
whistleblower, you don't find a lot.
00:03:17
Like, there are occasionally Apple whistleblowers, but
00:03:19
they're not like Facebook whistleblowers.
00:03:21
Like I always kind of bristle when people describe me as the
00:03:24
Facebook whistleblower because there's a new Facebook
00:03:26
whistleblower every two weeks, you know, often with really big
00:03:30
revelations. It's not like there's one person
00:03:32
in the company that's feeding all that out there.
00:03:34
It's like a whole range of conscientious people who know
00:03:37
the public needs to be brought in.
00:03:39
So you can ask the question, like, why is that?
00:03:41
Like, why is it Apple isn't producing the same volume or
00:03:44
intensity of whistleblowers? But I think it's that when Apple
00:03:48
launches a new iPhone, you know, within a couple of hours of that
00:03:52
iPhone going on store shelves, there'll be YouTube videos live,
00:03:57
like already live, where people have taken apart those phones
00:04:00
and confirmed, like, yes, that processor is in there.
00:04:03
Right. They'll be like, oh, they added
00:04:04
a hard drive and now the hard drive is slightly slower because of
00:04:08
it. Yeah, exactly.
00:04:10
It's very visible to people. But when it comes to what I call
00:04:13
opaque systems, and more and more of our economy
00:04:17
is going to be run by opaque systems.
00:04:19
So that's things like a large language model that lives on a
00:04:23
data center and all you see is the outputs, you don't see the
00:04:26
inputs, you don't see how they're manipulated.
00:04:28
It could be a social media site. So like you said before is the
00:04:31
problem because we all see something slightly different.
00:04:35
That's definitely a huge part of it.
00:04:37
So if we were talking about Google, have you ever programmed
00:04:40
before? No, I'm not a programmer.
00:04:42
I know you're not going to believe me when I say this, but
00:04:44
if you and I sat down for three weeks, I could teach you enough
00:04:48
programming that you could ask real, meaningful questions about
00:04:52
how Google works. You know, what's getting
00:04:54
distributed? What's not getting distributed?
00:04:56
How prominent are different kinds of things?
00:04:58
What kinds of answers does Google give? If we want to do the
00:05:02
same kind of accountability 101 for Facebook
00:05:06
or Instagram, don't even mention TikTok.
00:05:09
TikTok, it's way harder because it's
00:05:10
video. Right.
00:05:11
It's bandwidth intensive. You know how much harder it is to do
00:05:14
a video podcast than an audio podcast?
00:05:17
We'd have to recruit 20 volunteers and convince them to
00:05:21
install software on their phones that would send back what they
00:05:25
saw on these platforms. It's an entirely different
00:05:30
order of magnitude in terms of difficulty, and Facebook knows
00:05:34
that. With TikTok, sort of.
00:05:36
I really understand it right where it feels so tailored to
00:05:39
the human being. And it's
00:05:41
sort of machine learning driven, where it feels like, is there
00:05:44
even someone over at TikTok who really gets exactly why I'm
00:05:48
getting what I am? I mean but Facebook at least in
00:05:50
the beginning was supposed to be more like network based and like
00:05:54
friend based. Which
00:05:55
felt like, yeah, easier to trace.
00:05:58
Do you think it's totally moved away from that or why?
00:06:02
Why is Facebook so much harder to follow than Google?
00:06:07
So by the time I got to Facebook in 2019, 60% of everything
00:06:12
people in the United States viewed came from a Facebook
00:06:16
group. It wasn't content from your
00:06:18
family and friends anymore. It's one of these things where
00:06:21
one of the projects my nonprofit is going to work on (it's called
00:06:24
Beyond the Screen because we help people see beyond their
00:06:28
screens) is doing a simulated social network.
00:06:31
Because, you know, it's like you brought up before a lot of
00:06:34
people, if you were to stop them on the street and say, what is
00:06:37
Facebook for? They would say it's about
00:06:39
connecting with my family and friends.
00:06:41
But if you were to actually go and look under the hood of that
00:06:46
product, you know, that product where the only things you saw in
00:06:49
your feed were things that your family and friends posted, that
00:06:52
world is long gone. And the reason why it changed
00:06:55
was Facebook is an advertising supported business.
00:06:59
You know, as you scroll, you see an ad, they make money, you
00:07:03
click on that ad, they get money.
00:07:05
They have a motivation to get you to consume more and more
00:07:09
higher value ads every single month.
00:07:11
You know, the more time you spend on the platform, the more
00:07:14
content you view, the more ad dollars come in.
00:07:18
They had a huge motivation to figure out how can they make
00:07:21
sure your feed is always full of really stimulating content.
00:07:25
What's interesting about TikTok is TikTok said,
00:07:27
Hey, we don't want to wait to get critical mass.
00:07:32
You know, in the case of Facebook or like creating a
00:07:34
Facebook competitor, when the expectation from the user is the
00:07:38
things that I see come from my family and friends, you have the
00:07:42
like a chicken and egg problem where you need to get all of the
00:07:46
person's friends before the person wants to join, right?
00:07:49
Like how do you make that happen? With something like TikTok or
00:07:52
now Threads? Threads has a very similar
00:07:55
model. And that's Facebook's new
00:07:56
Twitter competitor. There's much less of a user
00:08:00
promise of what you're going to see.
00:08:02
So when you show up, they don't have to wait for critical mass.
00:08:06
They just need to entertain you. But at the same time, now you're
00:08:09
putting yourself in the hands of an algorithm, you know. You
00:08:13
better understand what the biases are in that algorithm,
00:08:16
what it shows you and doesn't show you, because now the
00:08:18
algorithm is in control, not your friends and family.
00:08:21
You brought up Threads and I too see the similarity with
00:08:25
TikTok where it's, you know, it's bite size.
00:08:27
You know, it's not as tied to even like a follower network as
00:08:31
Twitter because they needed it to work right away.
00:08:34
So it feels like, oh man, this is going to be really powered by
00:08:37
machine learning pretty quickly. Like, as, you know, a reporter,
00:08:41
It's funny to see sort of the reporter class sort of embracing
00:08:45
Threads at the moment when, I feel like,
00:08:47
you know, two years ago or more than that, they would have been
00:08:50
so negative and apprehensive about trusting Facebook.
00:08:55
I'm just curious like watching the sort of pretty upbeat
00:08:59
response to Threads, what do you take from that?
00:09:02
And like, are you surprised that there seems to be some media
00:09:07
trust of Facebook right now? I
00:09:09
think it's one of these things where the trauma the Twitter
00:09:13
community has faced in the last year
00:09:16
is pretty intense. You know, it's one of the things
00:09:18
where, you know, Twitter never really went down that often, now
00:09:21
goes down regularly. It used to be, you know, as Jack
00:09:24
Dorsey, the former CEO, has pointed out recently, you know,
00:09:27
it used to be basically when you posted, things went live
00:09:30
instantaneously. And now, you know, he
00:09:32
flagged that there are now, you know, a minute or two delays
00:09:35
between when you posted and when it went out to people.
00:09:38
I think Elon Musk really gave himself an incredibly hard hill
00:09:42
to climb when he bought the company using so much debt.
00:09:46
Because when that happened, it meant that he put a clock on
00:09:49
himself. You know, he has to make a
00:09:51
billion dollars of profit a year on a product that was maybe
00:09:55
breaking even before he bought it. It was really hard.
00:09:59
And he's had lots of advertisers flee.
00:10:01
So I think, further, the positive reporting on Threads is
00:10:05
that people really liked having a space to discuss ideas, to
00:10:10
discuss issues and the idea that they could have a space again.
00:10:15
feels really good. Just to say it another way, it's
00:10:17
like people were so frustrated with Twitter that it gave them some
00:10:21
openness to trust Facebook again.
00:10:23
Or Meta, whatever we're calling it these days. Anyway,
00:10:26
continue. It's interesting, like this is
00:10:28
still true, but people were reporting the, you know, the
00:10:31
first day or two of Threads, you weren't even given the
00:10:34
option to just receive content from people you chose.
00:10:38
Like it really is like TikTok that way, where you have to put
00:10:42
yourself in the hands of the algorithm
00:10:44
and be like, algorithm, may I please?
00:10:45
Yeah, there's no follower feed. Does that mean you haven't
00:10:48
downloaded it? Have you, personally?
00:10:49
I have not. I have not.
00:10:51
It isn't that I'm unwilling to download it;
00:10:52
I just haven't gotten around to downloading it yet.
00:10:54
And I say this as someone who worked on Google Plus.
00:10:57
So for your listeners who may not remember, you know tech 12
00:11:00
years ago. Yeah, Google said, hey, there
00:11:04
are flaws with Facebook, like we should make a version of
00:11:06
Facebook that doesn't have those issues.
00:11:09
And in reality, you know, they were trying to go after the
00:11:11
personal social media market, that's like, you actually connect
00:11:14
with people you know. But because they wanted to grow so
00:11:17
fast, they basically made something very similar to
00:11:19
Twitter. And you know, growing too fast
00:11:22
is actually a problem because it means that when people go and
00:11:26
take that chance on you, they don't land in a functioning
00:11:31
community. They land in chaos.
00:11:32
And they don't really understand what the point is.
00:11:34
They don't understand what the point of this place is, it doesn't
00:11:37
feel like a place and they move on.
00:11:39
And at least in some of the initial data around engagement, it seems
00:11:42
like that's a very similar problem to what Facebook is
00:11:44
facing right now. People were willing to download
00:11:46
it with Threads, yeah. I mean, one thing Facebook did
00:11:50
differently is they, like, prepopulated it with all these
00:11:53
celebrities. Yeah.
00:11:54
So they had that. And they had obviously the
00:11:57
Instagram sort of connection to flow people in.
00:12:00
But I agree there's a lot of similarity, especially and, you
00:12:02
know, I mean, what is it, Google Plus? Man, I can't even
00:12:05
remember the name. You just said it.
00:12:07
Google Plus. Yeah.
00:12:09
They like, I mean they got a ton of users in the beginning I, you
00:12:12
know, I was on it for a second and then it sort of faded.
00:12:15
So it's very possible. I mean Threads takes that same
00:12:19
arc, I don't think that's my personal view.
00:12:21
I mean is that a prediction or do you think this will be like a
00:12:25
Google Plus situation? I think one of the things that's
00:12:27
interesting about Twitter is the way I like to frame
00:12:31
Twitter is: Twitter is fueled by content
00:12:35
from a relatively small fraction of its users.
00:12:38
Who are most interested in the thoughts of that other small
00:12:42
fraction of users. So I talked to a law professor
00:12:46
who is regularly cited in tech discussions and she was like I
00:12:50
use Twitter to hear from 300 other people, and it just happens to
00:12:54
be that, with those people talking to each other and caring about what
00:12:58
each other says, everyone else gets to kind of be a fly on the
00:13:01
wall and follow along for the conversation.
00:13:03
There are people like Elon Musk who love like raising up their
00:13:06
followers and like having direct conversations with a wide swath
00:13:09
of people. But I'd say the fuel, the real
00:13:12
core of Twitter is those, you know, communities of connection
00:13:18
and it'll be really interesting to see if threads can maintain
00:13:21
that. Like is it just going to end up
00:13:23
kind of brand safe? By brand safe, I mean, you know, even
00:13:27
selling Tide washing detergent and Clorox, you know, bleach.
00:13:31
Yeah, they're very brand safe,
00:13:33
definitely. Really,
00:13:34
is it going to be just a space that is, like, innocuous?
00:13:37
Or is it going to be a space that really does cultivate
00:13:39
community the way Twitter did? I mean it also, you know, it
00:13:43
feels like you could have one community on threads and one on
00:13:48
Twitter, especially with the politics, you know, like, I mean
00:13:51
you're seeing Elon right now, he's like making these payments
00:13:54
out to Twitter personalities. It feels
00:13:57
like extremely conservative. I mean you could just see sort
00:13:59
of more left wing liberal, sort of, you know, people on threads
00:14:04
and sort of a right wing Twitter world.
00:14:07
It's very possible. I can totally imagine that
00:14:09
happening. It makes me nervous because like
00:14:11
when we do move into a space where we are entirely dependent
00:14:15
on an algorithm like, I have no idea if I post on threads like,
00:14:20
will my posts even get distributed?
00:14:22
You know, like my publisher couldn't even tag it to sell the
00:14:25
book, right? So it's things like that.
00:14:28
Wait, you think Facebook is like actively suppressing your book
00:14:33
on their platforms or? I know that for events I've
00:14:36
done, they couldn't use the word Facebook to describe the event.
00:14:40
So I can't be the Facebook whistleblower.
00:14:42
I can be a whistleblower. Like, you know, they'll have to
00:14:44
sit there and try a bunch of different variations of ads to
00:14:47
run an ad, I was told by my publisher. You know,
00:14:51
They sell lots of books. They post about lots of books on
00:14:54
Instagram. They went through and tried to
00:14:57
tag my book and they got back an e-mail saying that it violated
00:15:00
the commerce policy for Instagram.
00:15:03
And they pinged their, like, concierge, because they spent
00:15:07
enough money on ads that they get, you know, a human to talk to.
00:15:10
And the person was like, I don't know why this would violate the
00:15:13
commerce policy. I mean, there's not, like, nudity,
00:15:15
there's no violence, you know,
00:15:17
whatever. And yeah, so they don't
00:15:20
know why. I mean, do you think Facebook
00:15:23
just has a blanket ban on people using the word Facebook?
00:15:26
Ooh, I don't know. I've had other experiences, like
00:15:29
the Nobel Peace Center had an event that I spoke at, and they
00:15:34
had never had an ad blocked for being a political ad until they
00:15:39
advertised my event. So think about that.
00:15:41
Like, do you think the events at the Nobel Peace Center are
00:15:43
political? I think most of them are.
00:15:46
Right. So.
00:15:48
It is what it is. Have you interacted much with
00:15:50
Facebook otherwise or like? Have you had any direct
00:15:53
engagement since you left? I think if I was more of a
00:15:57
troll, you know I have a little bit of troll, not like a huge
00:16:00
amount of troll in my heart, I have like enough troll for
00:16:03
a spark, right. I would totally provoke them
00:16:06
more because they so aggressively will not
00:16:10
acknowledge that I exist. So if like for example, if you
00:16:13
ever get to be in a Q&A session with a Facebook executive like
00:16:17
at a conference, ask them about me, because they
00:16:20
will not use my name. So that's the kind of thing
00:16:22
where, like, if I was a bigger troll, I would
00:16:24
seed more questions at their events, but you know, I have
00:16:27
other things to do. So, back to sort of the core
00:16:31
disclosures, you know, in the book and I mean clearly you talk
00:16:35
about sort of them lying and I'm just curious like what you think
00:16:40
sort of the core or like great examples of like Facebook's
00:16:44
deceit have been? Because, I mean, there's
00:16:46
obviously they have such an information advantage where it's
00:16:49
like, yeah, they can just run circles around anybody trying to
00:16:52
scrutinize them because they understand the platform so much
00:16:55
more. But the cases where they were,
00:16:57
you know, lying. Flagrantly? So I'll give you an
00:17:00
example. This is one of the core parts,
00:17:03
in my complaints. Back in 2018, Facebook faced a
00:17:06
business problem. You know, they could see over
00:17:09
time people were making less and less content.
00:17:12
This is a normal phenomenon on social networks: some people
00:17:15
get really into it. Most people slowly start to self-
00:17:18
edit, self-censor. And they tried a bunch of experiments on you
00:17:23
and me to see if they could induce us to produce more content.
00:17:29
And one of the things, you know, it's like, guess what, you're in
00:17:31
experiments every day, every time you're on one of
00:17:33
these platforms. So it turns out if they
00:17:36
artificially get you twice as many likes, and
00:17:40
the way they do this is they just keep showing your post
00:17:43
to more and more people until you get to that number of likes.
00:17:46
If they can get you up to, you know, twice as many likes, you
00:17:50
produce more content. You know it's a very clear like
00:17:54
dose response: if you feel more social rewards, more little
00:17:58
hits of dopamine, you make more. So they came out and said, we
00:18:03
don't want people mindlessly scrolling.
00:18:06
You know, mindlessly scrolling, that's bad for people.
00:18:09
So we're no longer going to prioritize content on Facebook.
00:18:13
based on how much time you're going to spend on the platform.
00:18:16
That's like, you know, it could be a proxy for like how much
00:18:19
you enjoy consuming things. Instead, we're going to reward
00:18:24
meaningful social interactions. So I think about a phrase like
00:18:28
that, it's like, what is a like to you?
00:18:29
What is a meaningful social interaction?
00:18:32
Something that, you know, like a day later I'd reflect on and
00:18:35
say, oh, that was like a good thing.
00:18:37
And I'm like, glad it happened, you
00:18:39
know. Or maybe like your friends
00:18:40
shared something really revealing and you wrote a
00:18:43
comment saying, you know, that's hard.
00:18:44
I'm really glad you shared that. Like, that sounds like a
00:18:47
meaningful social interaction. In reality it was just: was there
00:18:51
any social interaction, right? So you could put bullying or
00:18:56
hate speech in a comment and that would be considered
00:18:59
meaningful social interaction. And within six months of this,
00:19:04
researchers across Europe on the left and the right, local
00:19:07
parties on the left and the right, were telling
00:19:11
Facebook researchers, we know you changed the algorithm.
00:19:15
It used to be we could post like a white paper on our
00:19:18
agricultural policy and, we get it,
00:19:20
it's like, not the most thrilling thing in the world,
00:19:22
didn't get a lot of comments, but we could see from the
00:19:25
stats that people spent time with it.
00:19:27
They read it. Now if we post that same kind
00:19:31
of, you know, bread and butter content, nothing.
00:19:33
It's crickets. Like, we're being forced to run
00:19:36
positions that are so extreme that we know our constituents
00:19:39
don't support them. But like, our job is to run
00:19:42
social media. Like we have to, like, put stuff
00:19:44
out there and it has to work or we'll lose our jobs.
00:19:47
And what's kind of crazy about this is in some ways Facebook
00:19:50
acknowledged implicitly they had a problem.
00:19:53
Because they put out, excuse me, Mark Zuckerberg put out a white
00:19:56
paper, probably one of his employees.
00:19:59
It's like 5000 words long. I really doubt Mark wrote it.
00:20:02
It said, hey, this engagement based ranking thing we just
00:20:06
launched, there's a problem, which is people are drawn to
00:20:10
engage with extreme content. But don't worry, don't worry.
00:20:14
And you know, we ask people afterwards, did you like it?
00:20:16
They say no. We're going to protect you from the most
00:20:19
extreme content by taking it down.
00:20:20
We're going to have these magical AI systems.
00:20:23
And even their solution to their first lie was another lie,
00:20:28
because those systems that they claimed would take down all the bad
00:20:31
stuff, they were only successfully removing 3 to 5% of
00:20:34
things like hate speech and less than 1% of violence, right?
00:20:39
It's kind of crazy when you think about it, but they told
00:20:41
everyone from Congress to, you know, the, you know, teachers
00:20:45
unions, like, we're protecting people. But in fact, it was all
00:20:49
kind of smoke and mirrors. I think this is the policy
00:20:51
solution you want, or at least the one that would seem to sort
00:20:54
of come from what you're saying is like okay, some sort of
00:20:57
disclosure where, you know, outsiders can sort of track how
00:21:01
this is all happening. I mean, how possible is that?
00:21:05
I mean, every social media network is so different.
00:21:08
You know, I'm a capitalist. I want companies to be able to
00:21:10
change and react and like, you don't want a bunch of laws that
00:21:14
like, say, oh, you need to disclose things this way, and so
00:21:17
now you're stuck with a certain type of the same network you've
00:21:20
had, instead of evolving. How would you see a disclosure
00:21:24
regime working that still allows companies like Facebook to be
00:21:28
flexible and to change? Let's actually unpack for a
00:21:31
second something you just said, which was, I think a lot of
00:21:34
people don't sit and think about what's the menu of options when
00:21:38
it comes to intervening in a problem as complicated as this,
00:21:41
right? I'm really glad that you brought
00:21:43
up the idea that these companies grow and change,
00:21:46
where, well, you know, the next one to come along might not
00:21:48
fit the exact same mold of this one.
00:21:50
One of the ways the European Union handles that flexibility
00:21:54
and, to be really clear, this kind of way of doing
00:21:58
regulation, of saying disclosure and transparency, is instead
00:22:02
of something like what's happening in say Utah, where
00:22:05
Utah is coming in and saying, this is how you will run your
00:22:08
company. If people are under 18, they
00:22:09
have to have parent supervision, no privacy for kids,
00:22:12
Their parents can see everything.
00:22:13
Or like Montana coming out and just flat out banning
00:22:16
TikTok. Those are kind of, we'll call
00:22:18
them building fences type rules, where we're like, oh, this is
00:22:21
the fence you can't cross. And the thing about technology
00:22:24
is it moves and changes, and they're very good at running
00:22:26
around fences. So the alternative is something
00:22:30
like what the European Union passed last year, which is
00:22:33
called the Digital Services Act. And the Digital Services Act
00:22:36
says, hey, if the core problem, you know we started at the
00:22:39
beginning of this conversation, if the core problem
00:22:42
is a power imbalance, right, like the fact that you can know
00:22:46
what's going on and I can't know.
00:22:49
Let's address that core problem because a lot of other things
00:22:52
will flow downstream from it. So they say, hey, if you know
00:22:56
there's a risk with your product, you need to tell us about it.
00:23:00
You know, if you discover one, if you can, you know, imagine
00:23:04
one, you need to tell us about it.
00:23:05
You need to tell us your plan for mitigating it, because it's
00:23:07
going to be different for every platform.
00:23:09
We want to unleash innovation. And you need to give us enough
00:23:13
data that we could see that there was progress going on to
00:23:16
meet that goal. And if we ask you a question, we
00:23:20
deserve to get an answer, which sounds really basic but is not
00:23:23
true today. You know, I've been asked
00:23:25
questions by governments around the world that are very basic,
00:23:28
like, how many moderators speak German?
00:23:32
I've gotten that question but for different languages all
00:23:34
around the world. And it's because Facebook
00:23:36
doesn't have to answer. They don't even have to answer
00:23:38
things like how many users are there in your country?
00:23:41
Which is extremely frustrating given that Facebook's stance has
00:23:44
always been where their move has been like, regulate us.
00:23:47
Like, you know, not all these decisions should be made by a
00:23:50
private company. And then it's like, oh, well,
00:23:53
then why don't you at least give the governments around the world
00:23:56
the information they would need to write good regulations?
00:23:59
So, I mean, obviously you're talking to a journalist, so
00:24:02
disclosure is always gonna be something I'm gonna cheer for,
00:24:04
but certainly a frustrating situation.
00:24:06
To provide a road map, though, so my nonprofit, it's called
00:24:09
Beyond the Screen. One of our core projects is
00:24:12
called Standard of Care where we are working to build out a wiki
00:24:16
that people can contribute to around identifying the problems
00:24:20
of social media. What are the surface areas or
00:24:23
what we call the levers for preventing or mitigating harm
00:24:27
and then what are the strategies for pulling those levers.
00:24:30
So just to give context on that, you know, for a lot of the problems
00:24:34
around kids, a lever that is common across
00:24:37
them is: let's keep under 13 year olds off these systems.
00:24:41
But when the kids advocates talk about technology, they often
00:24:46
don't know what's possible and so they'll settle on solutions
00:24:49
that might seem obvious but have problems.
00:24:51
So for example, in the case of the lever of keeping under 13
00:24:54
year olds off the platform, they'll say let's check ID's,
00:24:57
which I don't think you want, I don't think I want.
00:24:59
It also just doesn't work. But if you've gone to a
00:25:02
technologist and said, hey, I have this lever.
00:25:06
Let's keep under 13 year olds off the platform.
00:25:09
That technologist could come back and say here's 10 or 15
00:25:11
different ways to find under 13 year olds.
00:25:14
You know, some of it's really basic, like kids will say I'm a
00:25:16
fourth grader or things like I learned this one from my
00:25:18
principal: kids report each other, like, to
00:25:21
punish each other. So like you're mean to me on the
00:25:24
playground and I'll report your Instagram account and as soon as
00:25:27
you find 10 kids, you can find all the other ones.
00:25:32
And so the way I think this could tie into transparency
00:25:35
is once we have a menu saying these are the harms, these are
00:25:39
the surface areas, the levers for addressing each of those
00:25:43
harms, you can come in and say, okay.
00:25:45
Then, at a minimum, like, I think there should be raw data
00:25:48
assets for researchers. But if you don't want to go that
00:25:50
far, at a minimum you can say we need data on how bad each of
00:25:54
those harms are and how hard you're pulling the levers to try
00:25:59
to reduce those problems. You can figure out lots of
00:26:02
different strategies to pull those levers.
00:26:04
But you need to show us data on things like under 13 year
00:26:07
olds, how late are they on the platforms at night, that kind of
00:26:10
thing. I mean, a core response that
00:26:14
Facebook would give in this situation would just be, they
00:26:17
might not say this outright because it's probably not
00:26:19
political, but like some of these problems that you've
00:26:22
identified are just human problems, right?
00:26:24
If you talk about sort of the Instagram critique with it
00:26:27
potentially making sort of young teenage women, sort of,
00:26:32
some segment of them unhappy. I mean, you could say, like, was
00:26:36
that so different from Vogue? Is this really an algorithmic
00:26:39
problem, or is this just like how humans are?
00:26:42
I mean, and I would probably add on that a lot of the
00:26:46
sort of Democratic liberal sort of anger at Facebook was about
00:26:51
just what Trump supporters are like and like their views and
00:26:54
the fact that there's like an audience for them.
00:26:56
And not always the fact that Facebook would give them
00:26:59
distribution. So I guess across a lot of these
00:27:02
categories and we can get into the particulars of totally I
00:27:04
mentioned. But yeah, what would you say
00:27:06
about just, like, some of these things people are mad
00:27:08
about are just things that human beings do that happen to happen
00:27:11
on Facebook. But it's not necessarily their
00:27:13
levers that are moving people to do those things.
00:27:17
So I think one way to think about this is technology can
00:27:20
either amplify and bring out the
00:27:22
worst in us, or it can act as a bridge that
00:27:25
helps us seek our best and happiest selves.
00:27:28
So I totally agree with you that, you know, there have always been
00:27:31
teen girls that were unhappy about their bodies or how nice
00:27:35
their clothes were, but there are a limited number of pages of
00:27:39
Vogue every month. You know, the second time you
00:27:41
read Vogue, it's going to have a different impact on you than
00:27:44
the third time you read Vogue, or you're going to get bored of
00:27:46
it. And in the case of something
00:27:48
like Instagram, you know, Instagram progressively pushes
00:27:52
you towards more and more extreme content.
00:27:54
So I'll give an example. I had a journalist reach out to
00:27:57
me for an interview and he explained to me that he had just
00:28:00
had a new baby. So this is a healthy, happy baby
00:28:03
boy. He's a modern father.
00:28:06
He made an Instagram account for the baby.
00:28:09
That baby had maybe five or six baby friends.
00:28:12
Everyone here is a healthy, happy, cute baby.
00:28:16
He's only ever clicked on or commented on happy, cute,
00:28:19
healthy babies, and yet about 10% of his feed was children who
00:28:24
were visibly suffering. So, kids who had had horrible
00:28:27
accidents and were disfigured, disabilities and deformities
00:28:31
that looked really painful. Kids dying of cancer in hospital
00:28:34
beds with tubes coming out of them.
00:28:36
And he was like, Frances, how did we get from healthy, happy
00:28:40
babies to this? Like, what happened?
00:28:43
I've only ever clicked on the happy, fun content.
00:28:46
And I was like, well, the AI knows very clearly you like
00:28:50
babies. You know you've made this whole
00:28:51
little baby-centric world. So it's showing you content it
00:28:55
knows that people who like babies have trouble not engaging
00:28:58
with. And I want to be honest, even if
00:29:00
you're not clicking like on that content or sad or whatever you
00:29:04
probably are lingering, and a lot of these AIs have dwell time;
00:29:09
just the fact that you paused is a signal that
00:29:12
you like that content. And so, you know, he's old
00:29:15
enough and, you know, cognitively mature enough to see
00:29:19
something weird is happening. With a 13 year old girl,
00:29:23
you know, she might start out by looking for something like
00:29:25
healthy recipes and, just by clicking on the content, get
00:29:29
pushed over time towards more and more extreme materials.
00:29:33
And we see these reports from things like child psychologists
00:29:36
who say I have these kids in my practice.
00:29:39
They come into their appointments and say I'm trying
00:29:41
to make better choices. Like I'm trying to follow the
00:29:44
program, but it follows me around Instagram.
00:29:48
And right now we live in a world where we don't have consumer
00:29:51
rights to really basic things like should you be allowed to
00:29:55
reset an algorithm without losing all your past.
00:29:59
So, like, those kids are being forced to choose between their
00:30:01
past. You know, all their memories,
00:30:03
their friends, and their futures, because an algorithm wants to
00:30:06
keep them from moving away from something that was hurting them.
00:30:10
I'm sure you saw, you know, if you want to delete
00:30:12
Threads, they say you have to delete
00:30:13
Instagram too, which is just like insanity.
00:30:17
I mean, it's like, oh, we know you love this other thing.
00:30:19
I'm much more supportive of the idea that we should regulate tech through
00:30:24
laws that, like correct it rather than the sort of huge
00:30:26
antitrust push. I mean to me like things as
00:30:29
simple as just, like, the push notification hacking.
00:30:32
Like we need to escape a world where, like,
00:30:35
tech companies are allowed to use sort of little badges that
00:30:38
psychologically drive us crazy but don't show, like, actual
00:30:42
alerts. But the problem is you write
00:30:44
that rule, you know, like I'm sitting here like I'm frustrated
00:30:47
with like Facebook. And I'm like, why is Facebook
00:30:49
showing me this group that I never use?
00:30:51
But it knows I love to clear algorithms.
00:30:53
We should ban it. But then, you know, for the next
00:30:57
startup that's trying to build and doesn't know anything about,
00:31:01
you know, it's just like, it creates a huge regulatory
00:31:03
burden. They could end up helping
00:31:06
Facebook. So this is part of why we want
00:31:09
to do our standard of care project, right.
00:31:11
Like right now Facebook has a really huge advantage in that
00:31:15
they have done research in this space for decades and decades.
00:31:19
15 years, 17 years, somewhere in there. I guess they are coming up
00:31:23
on 20 years cuz it's 2023 and they were founded in 2004.
00:31:28
But you want to talk about real deep cuts.
00:31:30
Do you remember Friendster, or are you not quite old enough?
00:31:33
Yeah, that's... So Facebook was my first real
00:31:37
social media. I graduated... You weren't a MySpacer?
00:31:39
In 2009, so. But I'm aware of sort of the history
00:31:42
of it. Yeah, the first web crawler, so that's a thing for
00:31:47
downloading data off the Internet so you can analyze it
00:31:50
that I ever wrote, I wrote for Friendster, because I was
00:31:53
using really early graph mapping algorithms because I guess I've
00:31:55
always loved graph based problems.
00:31:58
But yeah, part of why we're doing standard of care is we
00:32:01
know we want to make sure that the next generation of social
00:32:04
platforms and you know, social platforms take lots and lots of
00:32:07
different forms, things like Roblox.
00:32:10
Roblox is a social network. You know, we're going to always
00:32:13
have new social platforms. How can we make sure that people
00:32:16
have the best head start, you know, the best platform to build
00:32:20
off of to be like, oh interesting, we can get a very
00:32:23
robust education on what we should be worrying about
00:32:26
very quickly, and an understanding of, like, what
00:32:28
options exist for being able to address all those problems.
00:32:33
Is privacy a big part of your advocacy?
00:32:36
Or, like, how optimistic are you? To me, I'm sort of like, oh man,
00:32:40
privacy is sort of lost in this world once you're on them.
00:32:43
Like, I'm just not, like, a privacy person. I'm much more
00:32:48
aligned on this sort of like, oh man, them hacking our brains and
00:32:51
like, using actual, like, psychological tricks to get us
00:32:53
to engage with them. That's really troubling.
00:32:56
But like, yeah, I don't know how much time are you spending on
00:32:59
sort of privacy advocacy and what's your sort of view on on
00:33:03
that part of it? So privacy is not what I
00:33:05
would say one of my tentpole issues. I'm very open to it as a
00:33:08
technique in terms of, you know, people emotionally connect with
00:33:12
privacy as an issue and it is a way of decreasing the ability of
00:33:17
algorithms to be able to act towards you.
00:33:19
Like if they're not allowed to record information about you,
00:33:22
it's harder to manipulate you. At the same time, like, you
00:33:25
know, we're developing AIs that are getting better and better at
00:33:28
doing implicit inferences. And so it's this question of,
00:33:31
you know, how little data do you need in order to actually still
00:33:34
see a lot of these phenomena. And also, I think it's one of these
00:33:37
things where it's like, I don't think you can write privacy laws
00:33:40
that are going to strip enough data to actually neutralize the
00:33:44
problems that we're talking about
00:33:45
here. I'll give you a really basic
00:33:46
example. One of the big ways you reduce
00:33:50
misinformation on something like WhatsApp, which is end to end
00:33:53
encrypted, it's private chat, is saying, hey, you know, as
00:33:58
something gets reshared, so there's a chain of reshares.
00:34:01
So I got it, I reshared it, my friend reshared it, on and
00:34:04
on. There is a chain of reshares.
00:34:07
Once it gets 5 hops away from the person who created it, say
00:34:11
to them, hey, you can totally spread this further, but you
00:34:15
have to copy and paste it. You have to make an affirmative
00:34:17
action to continue forward. That kind of change, you
00:34:22
know, allowing that or not allowing it,
00:34:26
is not really a privacy topic, but it's one of the most
00:34:28
impactful things for safety. So I'm very pro privacy
00:34:32
legislation. If people want to push it, I
00:34:34
think it can have a really positive impact on a lot of the
00:34:37
problems I talk about. But it's not intrinsically going to
00:34:39
solve all the things that we're talking about here.
00:34:42
No, that makes a lot of sense. I sort of alluded to the
00:34:45
political piece of this sort of digging into that section.
00:34:49
I mean the political asymmetry in the United States and like
00:34:55
how that affects conversations about Facebook.
00:34:58
Like to me it's like pretty obvious that like Trump
00:35:02
supporters, Republicans like are behaving in sort of different
00:35:05
ways than Democrats and spreading generally more false
00:35:09
information. And like, it's very awkward
00:35:12
for Facebook, certainly. And politicians or, well, I
00:35:16
don't know, journalists can sometimes sort of have trouble
00:35:19
sort of highlighting that asymmetry.
00:35:22
I don't know. What's your view on it?
00:35:23
Do you agree with me that there's like a sort of partisan
00:35:26
asymmetry there and that that is sort of creating some of the
00:35:30
problems about how Facebook is able to act?
00:35:32
Facebook doesn't want to target Republicans because then they're
00:35:35
going to get a lot of heat from Republicans.
00:35:37
And so then they're not willing to do sort of things that would
00:35:41
have an asymmetric outcome, I guess.
00:35:44
Do you agree with that, is sort of the question.
00:35:46
Facebook has spent a lot of money trying to frame the issue
00:35:49
of what can be done about any of these online safety problems in
00:35:53
terms of freedom of speech versus safety.
00:35:56
And they get up there, and this is a real quote.
00:35:59
Mark Zuckerberg went on a podcast and he was like, you
00:36:03
know, I've really grown a lot in the last year.
00:36:06
This was like, I don't know, a year after I came out, nine
00:36:09
months. You know, I've really grown a lot
00:36:11
in the last year because I've realized sometimes if you stand
00:36:15
up for what you believe in, you're going to be unpopular.
00:36:19
And I'm a defender of freedom of speech.
00:36:22
And what I found so aggravating about this is like, you know,
00:36:25
that thing we just talked about with WhatsApp. You know, when
00:36:27
you cut the reshare chain at five. You know, if you do that same
00:36:31
thing on Facebook, when you cut the reshare chain at two,
00:36:34
it has the same impact on misinformation as the entire
00:36:37
third party fact checking organization.
00:36:39
And you're not picking and choosing what are the good ideas
00:36:42
and the bad ideas. And I think right now, you know,
00:36:45
a pattern we see across the world is if you are a political
00:36:50
party that is not in power, you have more of a motivation to
00:36:54
figure out new technologies, new ways of reaching people because
00:36:58
you're the party that's out of power.
00:37:00
You know you don't have the advantages of being an
00:37:02
incumbent. And so if we were to roll back
00:37:04
in time to say, the Obama presidency, like the first one,
00:37:08
you know, when Obama got elected, he did a lot of
00:37:11
techniques with technology that no one else was doing.
00:37:15
You know, they were doing, you know, what's known as A/B
00:37:18
testing emails. So they would say, you know, if
00:37:21
Obama holds a puppy, how much money do we make?
00:37:23
If Obama is there with his wife, how much money do we make?
00:37:25
You know, do people sign up for the XYZ thing?
00:37:29
He had a motivation to do that because he was the scrappy
00:37:32
insurgent. And so I think when we think
00:37:34
about these things as like there is a partisan lean that you know
00:37:37
one side is maybe playing the game a little harder than the
00:37:41
other side. The way I think of it is
00:37:43
that it's just a question of, like, who is or isn't in
00:37:45
power. And so what I always try to say
00:37:48
when I speak with right leaning voters or right leaning
00:37:51
podcasters, if you have any suggestions for ones I should go
00:37:54
on, I'm always trying to reach out to a more and more diverse
00:37:57
audience. If your listeners can think of
00:37:59
great ones for me to go on, e-mail me at
00:38:01
frances@franceshaugen.com. What I would say is, if you care
00:38:06
about freedom of speech, you should be demanding transparency
00:38:10
about these censorship systems. You know, when I talk to women's
00:38:14
rights advocates around the world, all of them have been
00:38:18
kicked off Facebook because these AI systems are so crude
00:38:22
that if you talk about violence against women, the AI thinks
00:38:26
you're committing violence against women.
00:38:29
Like, if you really care about freedom of speech, you should be
00:38:31
marching on the streets for transparency.
00:38:34
And I think that's a space that we should all be willing to work
00:38:36
at. I totally get sort of the
00:38:39
messaging there and that you want everyone to be sort of on
00:38:42
board with these issues. I mean, you look at even the
00:38:44
Threads rollout, when they launched it, they had some, I
00:38:48
think, tools in there early on that said, oh, you know, this
00:38:52
person, I forget what the exact language was, it was like.
00:38:56
oh, you know, they, I forget if it was that they'd posted false things, or
00:38:59
I'm not going to get it totally right.
00:39:00
But there were warnings before people reshared them, which then
00:39:05
like can have sort of a perceived sort of partisan
00:39:08
focus. And obviously, I mean, Elon
00:39:10
Musk, whole campaign on Twitter was the idea that Twitter was
00:39:15
sort of shadow banning, you know, people and the replies and
00:39:19
all that. Yeah, yeah, I don't know.
00:39:22
I guess the question there is like.
00:39:24
Clearly what we're seeing is there is some sort of right wing
00:39:27
backlash to cases where social media companies try to do what
00:39:31
you're saying, which is sort of either flag or not sort of a
00:39:35
size things that violate their policies.
00:39:37
I think — this, again, it's funny, since I actually worry about this a lot for AI safety. If you already feel distressed — you know, you feel like you have been left behind, or you're marginalized... And, you know, I grew up in Iowa. Like, I have a lot of empathy for Republican voters today. You know, Iowa has been left behind economically in a pretty dramatic way. If you already are feeling a little of that anxiety about the idea that people with power don't really care about you, it's very easy, when a moderation system has an impact, to read into it that you're being singled out.

And I think this is also true for, say, African American or less affluent people who participate, in that you have very similar things, where, like, African Americans will get sanctioned for hate speech because these systems are not very well done. And so I think there's a fear — and this is a fear that could be addressed by being more transparent, by saying, hey, we're going to actually let you see what we're doing — because, you know, in the case of AI safety, you're already seeing people come out with calls saying, I don't want systems that have been aligned with the public good, I want true or real AI. And I think that in both cases — whether it's what conservatives are saying about content moderation on Threads, or AI safety — you know, when people feel like it's out of their control, when they feel like something's being done behind the scenes, they object to it.
So giving the user control, basically.

Or having enough transparency that you can build trust on these things.

Right.

Like, it should be possible for researchers to come out and say, like, hey, actually — or for you yourself. So right now, if your content gets taken down on Facebook, the only people you can appeal to, or send your thing off to, is the Oversight Board. Imagine if you could say, I'd like to be put in a public research data set. Like, I would like reporters to be able to look at this and say, oh, interesting — you know, this whole political candidate is getting taken down at a much higher rate than this other one. Like, that would be a very different world. Facebook would have to work harder to make sure that its systems were objective and effective.
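Her public-research-data-set idea is easy to picture in code. A hypothetical sketch — the records, field names, and candidates below are invented — of the kind of comparison a reporter could run if people opted their moderation decisions into an open data set:

```python
# Illustrative sketch of the public-research-data-set idea: with opted-in
# moderation decisions, anyone could compute and compare takedown rates
# across, say, political candidates. Records and fields here are invented.
from collections import defaultdict

opted_in_decisions = [
    {"candidate": "Candidate A", "action": "removed"},
    {"candidate": "Candidate A", "action": "kept"},
    {"candidate": "Candidate A", "action": "removed"},
    {"candidate": "Candidate B", "action": "kept"},
    {"candidate": "Candidate B", "action": "kept"},
    {"candidate": "Candidate B", "action": "removed"},
]

totals, removed = defaultdict(int), defaultdict(int)
for d in opted_in_decisions:
    totals[d["candidate"]] += 1
    removed[d["candidate"]] += d["action"] == "removed"

for candidate in totals:
    rate = removed[candidate] / totals[candidate]
    print(f"{candidate}: {rate:.0%} of flagged posts taken down")
```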
In the book — The Power of One, which everyone should go read — you talk about, like, the rise of COVID. And, I mean, COVID sort of is a great way to get into the misinformation question, and, I think, like, the view on sort of that word and sort of the ideas there. What is your view on, like, how Facebook handled COVID? Like, are you supportive of, you know, tamping down on people who were, like, you know, skeptical about the origins of the virus or skeptical about the vaccine? Because that fits right into this sort of partisan thing, where there end up being sort of sides that align with parties that fit onto these. So, yeah, how do you score Facebook's handling of COVID misinformation?
Oh, great question. God, it's interesting. My criticisms of Facebook are much more — like, the part of the book that I think you're alluding to is, you know, when they went and divided up the United States into 600 communities. So think of a community as: you enjoy the same kind of groups, you follow the same kind of pages, you post on the same kinds of topics, you click on the same kinds of topics. You know, imagine you put people into communities that were between 500 people and 3 million people. If you went and asked, you know, how many of these communities make up 80% of all the received COVID misinformation, it turned out that 4% of the US population fell into communities that got 80% of the COVID misinformation, right? So you and me — the average person gets maybe one piece of COVID misinformation every couple of days, once a week. For a small fraction of people, they were getting whole streams of it. That's what happened with the January 6th people, but for different groups, different ideas.
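The 4%-gets-80% analysis she's describing is straightforward to sketch. This is an illustrative toy, not the internal study — the community data structure and numbers are invented — showing how you would measure what share of users sit in the communities that absorb 80% of misinformation exposure:

```python
# Illustrative sketch: cluster users into "communities," tally misinformation
# impressions per community, and ask what share of the population sits in the
# heaviest-hit communities that together account for 80% of all exposure.

def population_share_behind(exposure_cutoff: float, communities: list[dict]) -> float:
    """Fraction of users in the communities that together receive
    `exposure_cutoff` (e.g. 0.8) of all misinformation impressions."""
    total_misinfo = sum(c["misinfo_views"] for c in communities)
    total_users = sum(c["users"] for c in communities)
    covered, users_in_bucket = 0, 0
    for c in sorted(communities, key=lambda c: c["misinfo_views"], reverse=True):
        covered += c["misinfo_views"]
        users_in_bucket += c["users"]
        if covered >= exposure_cutoff * total_misinfo:
            break
    return users_in_bucket / total_users

# Toy data: one small community absorbs most of the misinformation stream.
toy = [
    {"users": 4_000, "misinfo_views": 80_000},   # heavily targeted cluster
    {"users": 48_000, "misinfo_views": 12_000},
    {"users": 48_000, "misinfo_views": 8_000},
]
print(f"{population_share_behind(0.8, toy):.0%} of users receive 80% of the misinformation")
# -> 4% of users receive 80% of the misinformation
```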
And part of that was because the way Facebook was designed was: if you have a post that is really controversial — you know, has a big fight in the comments — every new comment makes that post new, and it can show up at the top of your home feed again. And so there were a few communities where, you know, COVID had a really intense emotional valence. Those communities actively censored out voices that said anything different, and they became these kinds of echo chambers.
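Here's a toy model of the resurfacing mechanic she's describing — not Facebook's feed code; the scoring function and field names are invented — where a post's freshness is keyed to its most recent comment, so an ongoing fight keeps bumping a days-old post back up the feed:

```python
# Toy model of the mechanic: if a ranking score treats a post's "freshness" as
# the time of its latest comment, every new comment in a comment-section fight
# bumps the post back toward the top of the feed.
import math
import time

def feed_score(post: dict, now: float) -> float:
    """Simple decayed-engagement score keyed to the most recent comment."""
    last_activity = max([post["created_at"]] + post["comment_times"])
    hours_stale = (now - last_activity) / 3600
    freshness = math.exp(-hours_stale / 6)          # decays over roughly 6 hours
    return freshness * (1 + len(post["comment_times"]))

now = time.time()
calm_post = {"created_at": now - 48 * 3600, "comment_times": [now - 47 * 3600]}
fight_post = {"created_at": now - 48 * 3600,
              "comment_times": [now - i * 600 for i in range(20)]}  # fight still raging

# The two-day-old post with an ongoing argument scores far higher and resurfaces.
print(feed_score(calm_post, now), feed_score(fight_post, now))
```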
And so I think there are interesting conversations to be had around, like, how do phenomena like that occur, and, like, what other contexts are they happening in? And so, like, let's imagine you're seeing something like that, where, you know, the algorithm and the product design are pushing people towards these kinds of parallel realities. And, like, that's why you had people showing up and, like, you know, threatening teachers with guns, or, like, people showing up at school board meetings screaming — because literally, when they looked at their Facebook feeds, all they saw was stuff about how teachers were trying to kill their kids, right? So that's kind of the environment we're entering into when we have to say, like, what is Facebook's role, or what should Facebook's policy be?

And I think it's really complicated. Like, I think they did a bad job, in that they had, you know, these blacklists, they had the concepts that you couldn't talk about, but they never told anyone, you know. They never let us see how well these systems performed. And it meant that you had people feel like it was a hidden truth — and hidden truths are very alluring. And so, you know, effective social software should be designed such that if I write a thoughtful reply, if I go do research, if I come back to your allegation with something meaningful for a conversation, I should get similar amounts of distribution to your inflammatory statement — and that just doesn't happen today. Like, the systems are designed to reward extreme ideas. They're not designed to reward thoughtful, moderate ideas.
So your solution there would be just to give more distribution to sort of counter-messaging, basically?
So, like, they've done research inside of Facebook — and one thing that's kind of exciting about the next few months is, like, Harvard has an archive of most of the Facebook Files, and they're starting to make access to those documents available to academic researchers. One of the papers in there talks about how, if you are good at writing such that people diverse from you — different from you — can engage with that content or those ideas, you're doing a very complicated thing, right? You know, if I can write a political post that Republicans and Democrats can, like, constructively have a conversation about in the comments — you know, people are thumbs-upping each other, it's like a positive conversational template — I should be rewarded for that, because it's not obvious. Like, I might not get the most comments, I might not get the most likes. But if I can get a diverse community of people, a diverse audience, to engage with it, that should be rewarded. If you come in there and say, hey, we're going to boost content from people who can successfully reach across the aisle, you end up getting less hate speech, less violence — for free — and less misinformation.

So it's a lot of these things where, if you start making transparent what is distributed versus what is created on these platforms, and you start saying, hey, right now what you're giving distribution to is very different than what's being created — can we get those a little bit more in line? — I have a feeling what would end up happening is you'd start finding more techniques like that, where you could come in and say, hey, people who are fearmongering, you know, they shouldn't be the only ones who get to stand on the stage. And that's kind of what's happening today.
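A hypothetical sketch of the "reach across the aisle" reranking she's describing — the scorer, labels, and weights are invented, not a real platform API — where posts whose engaged audience is politically diverse get boosted over one-sided, higher-engagement posts:

```python
# Illustrative reranking sketch: instead of scoring purely on raw engagement,
# boost posts whose engaged audience is split across political groups.
from collections import Counter

def audience_diversity(engager_leanings: list[str]) -> float:
    """~1.0 when engagement is split evenly across groups, 0.0 when one-sided."""
    counts = Counter(engager_leanings)
    if len(counts) < 2:
        return 0.0
    return min(counts.values()) / max(counts.values())

def ranked_score(engagements: list[str], diversity_weight: float = 2.0) -> float:
    """Raw engagement, multiplied up when the audience is cross-cutting."""
    return len(engagements) * (1 + diversity_weight * audience_diversity(engagements))

outrage_post = ["left"] * 180 + ["right"] * 4     # lots of one-sided engagement
bridging_post = ["left"] * 60 + ["right"] * 55    # fewer engagements, but cross-cutting

print(ranked_score(outrage_post), ranked_score(bridging_post))
# The bridging post can out-rank the higher-engagement, one-sided post.
```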
So it's still a world where, like, there's, like, sort of a hand on the tiller in terms of what's getting reach, and, like, in some ways it's just doing things that are less oriented around their business interests, necessarily, which are max engagement, right? I mean, it's certainly not saying, like, don't weigh certain things differently than others, right? Because if you just tried to create this sort of, like, neutral algorithm, you would just then be deferring to sort of... the negative aspects of humanity, potentially. Am I understanding that correctly?
Yeah — algorithms can only capture the level of complexity that you put in the algorithm. So if you come in there and say, hey, people developing compulsive behaviors, right? So in the case of kids, you know, something like 8% of kids say that they lack control over their usage, and it's hurting either their employment, school, or their health, right? Think about that 8%. How self-aware are 14-year-olds? Like, not super self-aware. How honest are they with themselves? Not super honest. It's probably a lot more than 8%, right, that are suffering like that. You know, if you don't have a system that says, hey, sleep-deprived kids are a long-term harm to society — you know, they do worse in school, they're at higher risk for developing mental health issues that will last throughout their life or put them at higher risk of recurrence, they're more likely to use drugs, both uppers because they're tired and downers because they're depressed — you know, if you have an algorithm that just is, like, how many clicks can we get, like, how much ad revenue can we get, you don't capture those kinds of social costs.
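Her point that algorithms can only capture the complexity you put into them can be made concrete with a toy objective function. The cost terms and weights below are invented placeholders, not measurements:

```python
# Toy illustration: an objective that only counts clicks and ad impressions will
# happily keep a sleep-deprived teenager scrolling, because the long-term costs
# never appear in the score. Cost terms and weights are invented placeholders.

def engagement_only_objective(session: dict) -> float:
    return session["clicks"] + 0.1 * session["ad_impressions"]

def cost_aware_objective(session: dict) -> float:
    score = engagement_only_objective(session)
    # Subtract proxies for harms the pure-engagement objective never sees.
    score -= 5.0 * session["late_night_minutes"] / 60   # lost sleep
    score -= 3.0 * session["compulsive_use_signal"]      # self-reported loss of control
    return score

late_night_binge = {"clicks": 120, "ad_impressions": 300,
                    "late_night_minutes": 150, "compulsive_use_signal": 1.0}
print(engagement_only_objective(late_night_binge))  # looks great: 150.0
print(cost_aware_objective(late_night_binge))       # looks much worse: 134.5
```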
And so I think there's a huge opportunity, where if you just come in and say, like, hey — with cars, if you have a car accident, everyone gets to see the car accident. You know, everyone gets to see the body on the ground and see that, you know, your seat belt didn't work. We don't have a similar feedback cycle on these social platforms, and they can keep kind of having their problems, and they don't have anything that brings them back to center.
For the last part of the conversation, let's just talk about, you know, the decision to release information and to go to the Wall Street Journal. And, yeah, the reporter — it's Jeff Horwitz, right? He reached out to you on LinkedIn. Tell the story a little bit about how this happened. Like, did you see yourself —

Sure.

— like, as a whistleblower before you heard from him?
I had been contemplating it for quite a while — like, would I have to go forward at some point? Like, I had a chance to talk to my parents about it a large number of times, just because, like, what I was seeing on a day-to-day basis while I lived with them during COVID was just so different than what the Facebook public narrative was on these issues. But the moment where I was like, OK, I have no other options, was right after the 2020 election. So this is in December — it's less than 30 days after the election. They pulled us all together on Zoom, and they said, hey, you know how, for the last four years, the only part of Facebook that was growing was the civic integrity team — so, the team that was for facebook.com, at least. Civic Integrity's job was to make sure that Facebook was a positive force in the world — like, a positive social force in the world. You know, it wasn't going to disrupt elections. It wasn't going to cause any more genocides, because by that point there'd been two. You know, it was going to be a positive force. And they said, hey, you are so important, we're going to dissolve your team and integrate it into the rest of Facebook. And when they did that, that was kind of the moment where I realized Facebook had given up — that the only way Facebook was going to save itself was if the public got involved, that the public had to come and save Facebook.
And so, by chance, that day I went and opened up LinkedIn — because, you know, when you have instability at work, that's what you do. You know, you open up LinkedIn. And I saw that I had a message from this guy, Jeff Horwitz, and he did a lot of reporting for the Wall Street Journal about the violence that Facebook had facilitated in India, particularly, like, Muslim-Hindu violence. And, you know, he said, do you want to talk?
And I was like, oh, interesting — like, of all the places that I would want to work with, I wanted to work with the Wall Street Journal, because I really view all of these topics as bipartisan. You know, they're not left, they're not right. They're, like, basic rules of the road. And I knew that if the reporting had come from the New York Times, there would be a large swath of right-leaning voters that would never be able to trust it. But if it came from the Wall Street Journal, it was likely that they would be able to at least consider it. And I think that's part of why the Senate hearing was so bipartisan — because it was something that came out of, you know, a publication that was trusted like that. A lot of the things that were said in those articles sounded crazy. They sounded, like — there's no way, there's no way the company could be this bad — like the human trafficking thing, of the Facebook employees being worried about offending buyers over the people being trafficked. But because it came from a very, you know, center-of-the-road, cautious publication, people were like, oh, maybe this is actually true.
And, yeah, I mean, I think the Journal is so credible and sort of careful in how they present things. Initially, you were not going to come out, right? Like, what motivated your decision to go public?
So — wondering why the disclosures are so large — I wanted them to be able to stand on their own. Like, I always expected to do, like, closed-door briefings for, like, governments, to be able to explain them. But I never intended to be, like, part of the story. And right before I came out — so, maybe a couple months before — my lawyers started really putting pressure on me, where they were saying, you know, Facebook knows it's you, and they can look at all the different documents, right? They know it's you. You know, the reporting is going to get started. They're going to figure it out real fast — or as soon as they start going to ask for comment, they're going to figure it out real fast. And the Wall Street Journal — they said, here's the deal. Like, either you can wait for the rest of your life for Facebook to present you to the public — you know, every day you're, like, opening, you know, Google News, you're going to open your newspaper and be like, is today the day that Facebook is going to introduce me in the worst possible way? — or you can take responsibility for what you did. You know, I know that you don't want to be out. You don't want people looking at you. You don't want to be out there. Like, you know, go and take responsibility — because if you do closed-door briefings, every single briefing you do will expand the circle of trust, and you're going to be, like, the juiciest story for some reporter to break, like, if they can find you — the Post or somebody would have just...
And one of my friends joked, like, after I came out, they were like, I don't understand how this story stayed quiet for so long — I thought everyone in San Francisco knew Frances was doing this. And I was like, everyone in our circle of friends knew I was doing this; not everyone in San Francisco. But still, it's one of those things — like, someone brags to their friend about how, "I know where the Facebook whistleblower is," and, you know, it just goes. And so I decided to step out into the light.

And it's actually been a really transformative experience. You know, it's one of these things where I spent a lot of my life, like, really trying to avoid being seen. You know, like, I've gotten married twice — I eloped both times. The idea of standing in front of a crowd, having them just, like, stare at me for my wedding sounds very stressful. You know, I've had two birthday parties in the last, like, 20 years. Like, it's just not my jam. But it's been really interesting having to, like, stand up and try to educate people on these issues. Because, you know, I really believe in democracy, and the way the world changes is we change it. You know, we get out there and we say, hey, you know, I'm going to keep repeating this until I see something different in the world. And having to show up in my own life has been a huge blessing. And it's something I never expected to be one of the things that I would walk away with, you know, two years later, and say: I'm so grateful for this.
Awesome. That is a great ending to the podcast. Thank you so much for coming on, and I really appreciate talking to you.

My pleasure.
The honest thing I'd leave people with is: if you think all these things are too complicated — you know, like, "I couldn't understand a technical book" — the whole point of The Power of One is that democracy requires an informed population. So it is written so that, if you could follow along today, I guarantee you'll be entertained and able to follow along with The Power of One. And there are lots of fun stories and, you know, crazy hijinks along the way, because I've always gotten into trouble when, you know, I didn't mean to. And so I hope you come along on this journey too.
Thank you so much. That's our episode. Thanks so much to Tommy Herron, our audio editor. I want to shout out Annie Wen, our intern for the summer, who has been helping me prep for the podcast and working on punching up the show. Shout out to Riley Kinsella, my chief of staff, and Young Chomsky for the theme music. This has been the Newcomer podcast. Please like, comment, and subscribe on YouTube, give us a review on Apple Podcasts, and, most importantly, subscribe to the Substack, newcomer.co. Become a paid subscriber today. It makes this all possible. Thank you so much. See you next week.
Goodbye. Goodbye.
