The Musk Bubble in Tech Is Going to Pop (w/Alex Stamos)
Newcomer Pod · December 14, 2022 · 01:12:17 · 49.64 MB

This past Thursday Elon Musk accused Alex Stamos, Facebook’s former chief security officer and the director of the Stanford Internet Observatory, of running a “propaganda platform.”

That’s the sort of upside-down thinking we’ve come to expect from Musk, given that Stamos is one of the most fair-minded and serious thinkers about content moderation and social media platforms today. So, on Friday, we had Stamos on the Dead Cat podcast to talk about Musk’s choreographed leaks about the old guard at Twitter.

Last time Stamos came on the Dead Cat podcast, he explained why the media had underplayed its own culpability in enabling Russian disinformation while playing up Facebook’s failures.

This time, Stamos helped Dead Cat co-host Tom Dotan and me do our best to steelman Matt Taibbi and Bari Weiss’s purported exposés on Jack Dorsey’s Twitter, though none of us were particularly impressed by the seriousness of their reporting.

Taibbi documented Twitter’s handling of the Hunter Biden laptop story. And Weiss reported on Twitter’s decision to ban Trump after the January 6 insurrection attempt. (Since we recorded on Friday, the gang has continued to pump out new iterations of the Twitter Files.)

Stamos argued that the real essence of the critique that emerged from Taibbi’s reporting was the reality that governments across the world are actively trying to shape social media companies’ content moderation decisions.

“Government interference on platforms is a real deal,” Stamos said.

However, Taibbi didn’t show that in his tweetstorm. Joe Biden wasn’t in office, let alone the White House, when Twitter decided to briefly censor the New York Post’s story about the Hunter Biden laptop.

“Musk giving very selective data to a couple of very politically biased journalists is not the kind of transparency we would need if we wanted to be confident that there was no interference on this platform by government,” Stamos said.

“The kernel of truth is that every government on the planet, including ours in the United States of America, is trying to manipulate Twitter and all of the other major platforms. And so I proposed — here are things you can do if you’re Musk: You can have an open database of moderation decisions,” Stamos said. “You can commit to releasing — instead of just releasing emails from the Biden team, the DNC, not government actors — you could release all communications from all government actors globally. So the Modi campaign, the Indian government. I think it’s really important for somebody whose net worth is tied up in China, like Musk. Communications with the Chinese Communist Party is the kind of thing that if you really cared about this, you could release. But Musk apparently doesn’t like that idea.”

On the podcast, we talked about Musk’s extreme exposure to China and Musk’s unwillingness to criticize the Chinese government. (Even as Musk has been vocally critical of the United States’ response to the pandemic, he hasn’t said anything about China’s far more restrictive Covid lockdowns.)

“Tesla has a massive Gigafactory in Shanghai from which they will be producing cars to be shipped around the world,” Stamos said. “China is already 25% of the revenue. Clearly, if you look at discounted cash flows for the future, China’s going to be more than 50% of their stock price. And so yes, the [People’s Republic of China] has huge leverage over him. It’s like if Mark Zuckerberg, instead of having all his money in Facebook stock, had his money in a Chinese pharmaceutical company.”

Ultimately, Stamos predicted that the Musk fever in Silicon Valley would break. But for the time being, he said, Musk was playing the same role that Trump did at many Thanksgivings across America — dividing friends and families.

“I’m putting down a marker now. The Musk bubble in tech is going to pop,” Stamos told us. “Right now, it has become trendy to be seen as counterculture to be seen on Team Musk. And there’s a bunch of people who I used to work with, people who I’ve interacted with socially, who are smart, serious people who are now kind of waving the Musk flag. And just like with Trump. Just like with FTX. That bubble is going to pop. And you’re going to see all these people all of a sudden try to rewrite history.”

Stamos argued that Musk is unraveling before our very eyes. Tesla’s stock is down 60% this year while most of the media attention has been focused on Twitter.

“Musk is accelerating his breakdown here,” Stamos said. “If you end up with Twitter going out of business. Him having to give up Twitter to the bondholders or the debt holders. If you see him having to step down as CEO of Tesla. If you see some kind of massive moment — or you know, if there’s a horrible violent act that happens publicly. If there is, God forbid, something of the level of a Christchurch shooting or something that gets attached back to Musk’s moderation decisions — all of a sudden all these people who thought it was real cool to be on Team Musk are going to reverse themselves. And so I hope people are taking screenshots, because it’s just very scary. There’s a scary impulse in the Valley right now.”

Give it a listen

Read the automated transcript




00:00:05
Welcome, Silicon Salad. Hello, everybody.

00:00:13
Welcome to Dead Cat. Tom Dotan from Insider, joined by Eric

00:00:17
Newcomer of Newcomer. Thanks for joining us in the chat today. So

00:00:20
today we're going to do it. We're going to talk about

00:00:23
Twitter content moderation and the internal documents released

00:00:26
by Elon Musk. Maybe it's the only reason he bought Twitter; it's

00:00:29
not clear. So we're going to talk about how

00:00:32
content moderation is done, and we're going to like it.

00:00:35
So joining us to talk us through

00:00:36
this is Alex Stamos, former CSO at Facebook. He is also the

00:00:40
director of the Stanford Internet Observatory.

00:00:44
And he is now, apparently, an Elon Musk nemesis, where he runs

00:00:47
a propaganda platform, so says Elon.

00:00:50
But first of all, Alex, welcome back to Dead Cat.

00:00:52
Thank you so much for joining us to talk about all this.

00:00:54
Thanks for having me, guys. Returning guest, always love

00:00:57
it. Yeah.

00:00:58
Okay. So we're going to need a jacket.

00:00:59
How many do I have to do before I get

00:01:01
the Masters jacket? You're behind Erin Griffith.

00:01:04
And who else? It's like tacky, petty shit like

00:01:05
that. Yeah, so whoever gets to five

00:01:07
first definitely gets some swag. So we're going to do this, we're

00:01:10
going to do it in good faith, and I'm even going to call

00:01:13
it the Twitter Files, which is what Elon wants us to call it.

00:01:16
So basically, let me just summarize what has happened.

00:01:18
So far there have been two stories,

00:01:21
all of which have been posted on Twitter, which, as a side point,

00:01:24
I think is really lame. As a journalist, these stories exist

00:01:26
entirely on Twitter, and it's all because Elon is controlling

00:01:29
these documents. And I think, as part of his, you

00:01:31
know. Yeah, yeah.

00:01:33
The deal was they agreed to publish them on Twitter.

00:01:35
Yeah, but these are good journalists, independent

00:01:37
journalists, who have done it, but I think that particular

00:01:39
concession is lame. But anyway, here are the two

00:01:42
tweetstorms that have come out so far.

00:01:44
The first one is from Matt Taibbi, who is a Substacker

00:01:47
and, you know, longtime investigative journalist, and

00:01:50
he's writing about Hunter Biden. And in the tweets and, you know,

00:01:54
from the internal documents, he showed the internal

00:01:56
deliberations around pulling down the content that came from

00:01:59
Hunter's laptop, which was initially leaked

00:02:02
in the New York Post via Rudy Giuliani. Some of this was deliberations

00:02:06
inside Twitter about the decision to block all the links

00:02:09
from the New York Post, which was a pretty big deal at the

00:02:12
time, and I still think kind of crazy,

00:02:15
and some of it was just about pulling down nude pictures of Hunter

00:02:17
Biden. So that's right, we didn't make

00:02:20
that clear, but yeah, yeah. So part one was all about

00:02:23
Hunter Biden and that whole story that turned out not to

00:02:26
push the election towards Trump. And the second one, which

00:02:29
came out yesterday, or last week when you guys

00:02:31
all hear this, that one is from Bari Weiss, also at Substack,

00:02:35
and she's dealing with the topic of shadow banning, which is

00:02:38
basically the curtailing of the reach of certain users. In her

00:02:42
tweets, she showed that Twitter had

00:02:44
flagged and deboosted certain accounts, like Dan Bongino, a

00:02:48
bunch, I know. And once you know that guy, a

00:02:51
Stanford professor, Jay Bhattacharya, who none of us know, which

00:02:55
speaks, well... I just don't spend enough time

00:02:58
on Facebook to get the full Bongino.

00:03:00
No, I know, it's so.

00:03:02
So then there's also a Stanford professor, Jay Bhattacharya,

00:03:05
who had some anti-lockdown content.

00:03:07
It appeared that he had also been deboosted in certain ways.

00:03:10
And then there is Chaya Raichik, aka Libs of TikTok, and it

00:03:14
showed that all of these accounts had been flagged and

00:03:16
in some way kind of de-distributed in a way that they

00:03:20
hadn't been expecting: removed from trending topics and search.

00:03:24
And then also in Bari's tweets, there were internal debates

00:03:28
about the use of spam enforcement policies

00:03:30
as a trust and safety cudgel. And there's also this question

00:03:33
about whether Twitter was being genuine in

00:03:36
its previous claims that it did not shadow ban people, which

00:03:39
then of course gets into very exciting semantics of what

00:03:42
shadow banning means. All right.

00:03:44
So we want to bend over backwards to be fair to these

00:03:48
people. I think that's all our

00:03:49
inclination. Alex, I've seen you trying to steelman

00:03:51
the argument of these tweetstorms.

00:03:53
Is there a piece of it that you would say is the

00:03:57
best critique or the most substantive piece to come out of

00:04:03
these two tweetstorms, in your view?

00:04:06
So I feel like the first tweetstorm, the Taibbi one, which

00:04:10
focused specifically on the Hunter Biden situation, has a

00:04:15
kernel of truth in the recognition that there is a

00:04:17
problem with government and political interference on these

00:04:22
big platforms. The truth is, when you work at

00:04:24
one of these companies, everybody is constantly trying to work the

00:04:27
refs to get you to change your content

00:04:29
moderation strategies in a way they like. Everybody wants their

00:04:32
enemies taken down. And everybody wants their

00:04:35
political friends and fellow travelers to stay up, no matter

00:04:39
what they do. And so that is a problem that

00:04:42
companies are always dealing with. And it is true that if you

00:04:45
have officials in the US government pushing for content

00:04:49
to be taken down, then that would be, in the United States,

00:04:52
possibly a First Amendment violation. In the specific

00:04:56
Hunter Biden case, there's absolutely no evidence

00:04:59
in his thread that anybody who is an actual

00:05:02
employee of the US government, and therefore covered by the

00:05:04
First Amendment, did anything. The one example they could find was the Biden

00:05:08
team saying, take down these nude

00:05:11
pictures of Hunter. They didn't even say anything about the New

00:05:14
York Post story itself; it was subsequent tweets that had

00:05:18
specific nude videos and photos, which Twitter has a policy about:

00:05:22
you cannot put naked photos up of somebody without their

00:05:24
permission. We call this NCII in the

00:05:27
industry, non-consensual intimate imagery, but it's

00:05:30
essentially revenge porn, right? Revenge porn is the term

00:05:33
that people use. We try not to use it in the

00:05:35
industry, because "revenge" implies that the victim did

00:05:37
something wrong. The majority of the time it is

00:05:39
not the 40-year-old son of a presidential candidate.

00:05:42
It is a 19-year-old girl whose boyfriend did something

00:05:46
bad, right? So we try not to use

00:05:47
"revenge porn" because the median victim here is absolutely

00:05:50
innocent of doing anything. And in terms of the government

00:05:53
meddling to take something down: a lot of people forgot that

00:05:57
Trump was in the White House when the Biden team was asking

00:06:01
for these. There were people just saying

00:06:03
"the Biden Administration," when it's like, the Trump

00:06:06
Administration was in office. So there isn't even a question

00:06:10
there of whether, you know, the right people within Biden world

00:06:13
were making this request or not.

00:06:15
There was no Biden Administration at that point in time,

00:06:18
right? The DNC is not a government

00:06:20
actor. There's no First Amendment

00:06:21
analysis that covers the DNC's actions.

00:06:25
And Taibbi admits that he could not find any evidence of the

00:06:30
government being involved, and then also hints that

00:06:32
there were emails from the Trump Administration, which actually

00:06:36
could be a First Amendment violation.

00:06:37
He just does not address any of those, right?

00:06:39
So if we want to steelman this, if we want the opposite of a

00:06:42
strawman, if we want to take it seriously: government

00:06:44
interference on platforms is a real deal.

00:06:47
But Taibbi's thread did not show that. And he specifically ignored

00:06:51
the possibility of actual government interference. And Musk

00:06:55
giving very selective data to a couple of very politically biased

00:07:01
journalists is not the kind of transparency we would need if

00:07:03
we want to be confident that there was no interference on

00:07:07
this platform by government. All right, so tweetstorm one:

00:07:10
pretty weak. I think that sort of it is weak,

00:07:13
but fundamentally, I think there is a real kernel of

00:07:16
truth there. And from my perspective, the

00:07:18
kernel of truth is that every government on the planet,

00:07:21
including ours in the United States of America, is trying to

00:07:23
manipulate Twitter and all of the other major platforms.

00:07:26
And so I proposed, here are things you can do if you're Musk: you

00:07:29
could have an open database of moderation decisions.

00:07:31
There are some interesting privacy law issues there, but

00:07:34
you can work around those. You can commit to releasing,

00:07:38
instead of just releasing emails from the Biden team, the DNC,

00:07:42
not government actors, all

00:07:44
communications from all government actors globally.

00:07:47
So the Modi campaign, the Indian government. I think it's really

00:07:51
important for somebody whose net worth is tied up in China, like

00:07:55
Musk. Communications with the Chinese Communist Party is the

00:07:58
kind of thing that, if you really cared about this, you could

00:08:01
release. But Musk apparently doesn't like that idea, right?

00:08:04
We want to talk about that more, certainly.

00:08:06
While we're still on this topic: what you're saying is maybe the most

00:08:08
legitimate criticism or insightful revelation from the

00:08:12
Taibbi thread. When you were at Facebook,

00:08:15
you were probably in the middle of a lot of these discussions.

00:08:17
Generally, do you think internet platforms handle it

00:08:19
well? We have the internal discussions, or at least some of

00:08:22
the internal discussions, around what was happening around Hunter

00:08:25
Biden. It all seems fairly chaotic.

00:08:27
It feels a little bit arbitrary. And, you know, there is a certain

00:08:30
amount of, well, I guess within the arbitrariness, people's

00:08:33
inherent biases, and maybe what they would prefer to see happen

00:08:36
if they are, you know, donors to the Democratic Party or

00:08:39
whatever, informing some of their decisions.

00:08:41
So, generally, how good of an actor do you think the platforms

00:08:44
are in these situations, and is there anything from the Twitter

00:08:47
Files specifically that you thought kind of stuck out a little bit?

00:08:51
So in this specific situation (and, in parallel, Facebook made a

00:08:55
lesser mistake), I think Twitter and Facebook did make a

00:08:58
mistake here, in that they took on responsibility for something

00:09:02
that should not be their responsibility.

00:09:04
If we go back to 2016, there were two totally different propaganda

00:09:08
campaigns by Russia and Russian-affiliated groups against the

00:09:12
campaign. There is the campaign on the

00:09:15
platforms, which was mostly private actors in Russia,

00:09:18
the Internet Research Agency and other firms owned by

00:09:21
Yevgeny Prigozhin, mostly, in which they were trolling on the

00:09:24
platforms. That, I think, is absolutely the responsibility of

00:09:28
the companies. Facebook and Twitter should not allow a

00:09:30
handful of people in a building in St.

00:09:32
Petersburg to run a thousand accounts that pretend to be real

00:09:35
Americans and get hundreds of millions of views. That should

00:09:38
not be allowed. But it was relatively small next to...

00:09:42
The next one, you're going to say. Right, but that was a

00:09:44
relatively small impact, I think, versus the hack-and-leak campaign,

00:09:47
which was actually the government itself: the GRU, Russian

00:09:50
military intelligence, breaking into the DNC, breaking into John Podesta's

00:09:53
email, a bunch of other people's email, and then leaking

00:09:56
information to change the overall media environment.

00:09:59
Now, the target of that, while there are online

00:10:01
components here, the target of that is the US

00:10:04
media. And in the reaction to 2016, and with all the pressure,

00:10:08
as you know, we have all discussed this.

00:10:11
This was, if you go back and listen, we dug into this.

00:10:14
Yeah. Well, we dug into this.

00:10:16
I think it's a little unfair to completely blame social media

00:10:18
companies for Trump, but that is effectively,

00:10:21
I think, the feeling at the companies and across the entire

00:10:24
center-left media. And I think that there is a kernel of truth in

00:10:27
all of this criticism, in that the kind of New York

00:10:30
Times consensus is that Facebook created Trump. And if you push

00:10:34
companies that they are responsible for something like

00:10:35
that, then maybe they'll take on responsibility for it.

00:10:38
And I do think the companies should not have taken on

00:10:40
responsibility for that second class, the hack-and-leak, because

00:10:43
basically everybody else was primed from the last election, where

00:10:48
sort of late-breaking information helped tilt things

00:10:52
for Trump, and social media companies had been blamed for

00:10:57
fueling the pro-Trump message. So then we get towards the end

00:11:01
of the second Trump election, and all of a sudden we have this New

00:11:04
York Post story about the Biden laptop.

00:11:06
It might be hack-and-leak material.

00:11:09
And so Twitter decides, I think we all think that's wrong now,

00:11:13
but Twitter decides, for reasons that we can all understand, to

00:11:17
try and block the story. Now, the story still ran

00:11:20
plenty wide. It went wider, right?

00:11:22
There's a massive Streisand effect here, right?

00:11:24
Where, because of Twitter's action, and,

00:11:26
to be fair, Facebook also did a thing where they kind of

00:11:30
downranked it a little bit, so it didn't show up in

00:11:32
recommendations and such until it was fact-checked, and then they

00:11:34
released that as well, but it was allowed to be posted on

00:11:37
Facebook. But because of that action,

00:11:39
there's a massive Streisand effect, and people pay way more

00:11:41
attention to it, and it completely dominates

00:11:43
the discussion in the last weeks of the campaign. And

00:11:45
Democrats were somewhat dishonest about this whole thing

00:11:49
and tried to make it seem like the laptop was, like,

00:11:52
probably fake, when it seems like clearly

00:11:55
it was real, not Russian, right?

00:11:56
And I think that figured also into, I imagine, some of Facebook's...

00:12:00
sorry,

00:12:00
Twitter's decision making: that we don't want to be seen

00:12:03
disseminating something that could have been,

00:12:05
you know, Russian in origin, or have some kind of Russian

00:12:07
involvement, and we've got a retread of everything in 2016,

00:12:10
right. And, you know, I think these

00:12:12
companies did not want to spend another four years of being

00:12:14
blamed for being pro-Trump. But I think this is where

00:12:17
Twitter did not have the fortitude to stand up and say:

00:12:20
look, if the New York Post publishes something that is

00:12:23
hacked, or obtained illegally, that is on them, right?

00:12:27
Like, I don't think that Facebook and Twitter should substitute

00:12:30
their editorial discretion for the editorial discretion of

00:12:33
journalistic outlets, even as those journalistic

00:12:35
outfits don't necessarily have great ethics, right?

00:12:37
But I do understand how they got here, because it was incredibly

00:12:39
sketchy, right? Like, the Post is the only outlet

00:12:42
that has it, right? They do not share the hard drive

00:12:44
with anybody else. So no other journalists... I

00:12:45
mean, the entire journalism world had this huge problem of

00:12:48
how do you cover this when you cannot authenticate

00:12:51
the documents or authenticate the drive. It took months and

00:12:54
months and months for the Washington Post to get a copy

00:12:56
later. The Washington Post had

00:12:58
forensic experts, who I really trust,

00:13:00
get it, and they found that the hard

00:13:01
drive had been modified. But none of this comes out in that

00:13:05
time. The other media outlets don't know

00:13:07
how to handle it. Twitter doesn't know. Again,

00:13:08
I think Twitter made a mistake. And so, if you want to say the

00:13:11
outcome of this whole thing is Twitter should not take

00:13:12
responsibility for stuff that might be a hack-and-leak,

00:13:15
I think that's true, because this is just the reality

00:13:18
of living in a free society: we have a free media.

00:13:22
We do not have a national secrets law; we don't have, like, an

00:13:24
Official Secrets Act. There are free societies

00:13:28
that say before an election you can't release a

00:13:31
bunch of, like, dramatic new information. And even if the

00:13:35
federal government doesn't have that principle, it's not crazy

00:13:39
that social media companies would say: listen, we're not doing

00:13:43
a great job sorting truth from falsity, and we don't invest

00:13:47
enough in it, we're bad at it. We don't want to be the vehicle

00:13:51
for a bunch of false information right before the election.

00:13:54
So we're basically going to say, we don't moderate enough, we

00:13:56
don't want to create this sort of, like, chaos right before an

00:14:00
election. That to me would be a reasonable stance.

00:14:02
I don't know if it's the one I would have taken. And it matters what you're

00:14:04
talking about: are they taking down a link from

00:14:07
the New York Times or the Wall Street Journal, or the, what, New

00:14:09
York Post? That's where the problem is.

00:14:12
There's a difference between them

00:14:13
being responsible for the organic content that is on their

00:14:16
platform, where people are using it. If you're on Facebook and you

00:14:19
say, I'm Joe Schmo, I actually live in Wyoming, and you really

00:14:23
live in Beijing, then Facebook has some

00:14:25
responsibility there. If somebody's posting a link

00:14:28
from the Washington Times, then I think the Washington

00:14:30
Times is responsible for that. And, you know, other

00:14:33
societies have this. In France, there's a

00:14:35
news blackout; I believe it's 48 hours before

00:14:37
the election. And this exact thing came up, in

00:14:39
that there was a Russian campaign called the Macron leaks,

00:14:43
where they had real stolen documents and then they inserted

00:14:45
fake documents into the real ones to try to confuse people.

00:14:49
And they released it hours before the deadline, because

00:14:52
their goal was to get online sources to cover the

00:14:56
quote-unquote Macron leaks and then not allow the mainstream media to

00:15:00
rebut it. It failed in France, right?

00:15:03
But we don't have anything like that; you will never have anything

00:15:05
like that in the US. And so as long as, you

00:15:07
know, there's not a rule around the actual media,

00:15:09
I don't think Facebook or Twitter can invent that rule

00:15:11
themselves. I want to move on to the Bari Weiss thing soon, but,

00:15:14
like, to structure the Hunter Biden laptop argument: I think

00:15:19
the liberal position would be, okay, maybe Twitter mishandled

00:15:22
it, but it doesn't undermine all top-down decision making.

00:15:26
They should still have processes for deciding what to put

00:15:29
out. And then there's sort of a

00:15:30
conservative position, which is: just look how much the experts

00:15:34
fail. Whether it's the laptop, whether

00:15:36
it's Covid disinformation, every time you

00:15:39
have a top-down censorship model there are major failures,

00:15:43
and so we should instead abandon the exercise

00:15:46
altogether. So Musk kind of

00:15:51
vibrates between these two states of, there should be no

00:15:55
content moderation, and then, we're going to do lots of

00:15:57
content moderation, but based upon my opinion. Because

00:16:01
the truth is, he's really wrapped around the axle by, like,

00:16:04
three or four decisions: the Hunter Biden laptop, the

00:16:07
deplatforming of the Babylon Bee, Libs of TikTok, probably

00:16:10
Libs of TikTok. Those are, like, four things that

00:16:14
are very public, that happened in the United States.

00:16:18
They do not represent 99.99% of what you have to do every day

00:16:22
to keep a platform like Twitter actually useful, right?

00:16:25
And that is where things are starting to fall apart at

00:16:28
Twitter: their basic ability to stop spam,

00:16:31
for example, is really getting bad. We're actually publishing a

00:16:35
blog post in the next couple of days on this, on how in China, like,

00:16:39
knowledge about the protests in China has been buried by spam,

00:16:42
and it looks like it might not really be the Chinese government;

00:16:44
it looks like it's just spammers taking advantage of the fact that

00:16:47
there's almost no anti-spam team left at Twitter now, right?

00:16:49
Like, and so if you decide we're just not going to do content

00:16:52
moderation at all, you will end up with 8chan.

00:16:54
You'll end up with something that is unusable at the scale at

00:16:57
which Twitter wants to operate. Yeah.

00:16:59
And I will link that blog post in the episode description,

00:17:01
because there's been a lot of talk about that. Last thing on

00:17:04
this side, before we move on to the Bari

00:17:06
Weiss thread: in terms of new revelations that came out of

00:17:09
what Taibbi had posted, it seems fairly thin to me.

00:17:13
I mean, even on the topic of, you know, not

00:17:17
allowing links to the New York Post story, I believe Jack even

00:17:20
came out while he was CEO of Twitter to say, in hindsight,

00:17:23
that was an extreme decision, that was the wrong decision.

00:17:26
There's already been some level of mea culpa on the part of

00:17:29
Twitter leadership, to say, we don't agree with the

00:17:31
way this played out in the end, we would take a different tack

00:17:34
than the way it happened. And so the idea that this was a

00:17:36
huge gotcha, or at least some kind of, like, confirmation of

00:17:40
suspicion on the part of conservatives and Elon Musk,

00:17:43
I don't think it was really there, right?

00:17:44
Anyone who's been following this, right?

00:17:46
Nothing he said was different than what Yoel Roth said on

00:17:49
stage with Kara Swisher, right? Yoel, who was in charge of

00:17:52
the trust and safety team, straight up said, we made a

00:17:55
mistake. Here's how the mistake was made.

00:17:57
This is what I'd do differently. And everything

00:18:00
Taibbi posted backed that up. And most importantly, Taibbi

00:18:04
said there's no evidence of government intervention in

00:18:07
Hunter Biden's laptop. And there is evidence of the

00:18:09
Trump campaign sending other stuff.

00:18:11
And then just kind of left that dangling out there. But does it

00:18:13
matter? Because what's happening is,

00:18:14
people are framing it up and saying this proves something,

00:18:18
and having all this anger and rage at, you know, all those

00:18:22
liberals at Twitter, without any real evidence.

00:18:25
It's honestly... I mean, I used to be a big Taibbi fan back in,

00:18:28
like, the Rolling Stone days. I loved the phrase,

00:18:30
the, like, you know, vampire squid

00:18:34
wrapped around the face of humanity, about Goldman Sachs.

00:18:35
Yeah, right. And I mean, obviously he should be pretty embarrassed

00:18:39
by this. I don't see how you come back.

00:18:41
I mean, one, the fact that he's, like, doing all this work on

00:18:43
behalf of this incredibly rich and powerful person, I feel like

00:18:46
it's incompatible. It's like it's just material

00:18:48
for Tucker Carlson to get mad about.

00:18:50
I mean, it sort of feels like all a pretense.

00:18:52
There's no evidence to back what he's saying.

00:18:54
It's crazy. It's just, like, from a basic

00:18:56
journalistic perspective, you can't have, like, 60 tweets

00:18:59
that are all breathless

00:19:00
and then the actual evidence you show doesn't demonstrate it.

00:19:03
He didn't see any evidence that

00:19:06
the government interfered, even though everybody around him

00:19:09
basically keeps suggesting that it did, and he

00:19:12
does nothing to clean up that record at all.

00:19:15
No, no. He, in fact... his thread is

00:19:17
inconsistent on this, right? Like, making claims that he

00:19:21
can't back up later in the thread.

00:19:22
Let's talk about Bari. So this one felt a little

00:19:25
bit meatier to me, in terms of showing off what could arguably

00:19:30
be a disingenuous stance on the part of Twitter when it comes to

00:19:34
shadow banning. So, like I kind of summarized at

00:19:37
the outset, there were internal, looks like screenshots really, of

00:19:41
certain accounts. You mentioned Libs of TikTok; we mentioned Dan

00:19:43
Bongino; we mentioned Jay Bhattacharya from Stanford.

00:19:47
It did show that they had tags affixed to their accounts

00:19:50
saying they were to be penalized, deboosted, whatever

00:19:53
term you want to use, and, you know, their content

00:19:56
not made as distributed. I mean, I don't know, what do you

00:19:59
think about everything that came through Bari's tweets?

00:20:02
So I think the, if we're once again going to take the best

00:20:06
possible version their commitment, I think this is a

00:20:09
situation where Twitter's Executives were not very

00:20:11
specific in their language, right?

00:20:13
Everything she is talking about is both in the terms of service

00:20:16
and has been in multiple blog posts from Twitter: that Twitter

00:20:19
will allow certain speech to exist, but will decide that

00:20:22
they will not amplify it themselves.

00:20:24
I think this is a good thing. In fact you know who agrees with

00:20:26
me on that? A guy named Elon

00:20:29
Musk agrees with me, in that he specifically talked about how

00:20:33
freedom of reach is not freedom of speech, right?

00:20:35
So that is a, that is a paraphrase of my colleague,

00:20:37
Renée DiResta, of what people have talked about for years,

00:20:40
which is a middle ground for these platforms: allowing

00:20:43
certain speech to exist, and to be findable by people who

00:20:46
specifically search it out. But not to use the features of

00:20:50
the platform that recommend stuff or provide amplification

00:20:53
to push that stuff out, right? Basically, Twitter is saying

00:20:56
there's some middle ground: if you're really, really bad, over

00:20:59
here, you're just off; if your speech

00:21:01
is just fine, you're over here; and we're going to deal with

00:21:03
the middle ground where we will allow you to exist.

00:21:06
But we're not going to recommend you, and we're not

00:21:08
going to push you on other people and that is effectively

00:21:11
what these different settings do: for multiple

00:21:14
repeat offenders, instead of taking them completely off, they

00:21:18
allow them to exist. They allow their followers to

00:21:20
see all their content but they don't recommend them.

00:21:23
It's like some of the debate on the Bari Weiss thread is

00:21:26
semantic. It is all semantics.

00:21:30
What is shadow banning? Because Twitter basically said

00:21:33
we don't shadow ban, but then Twitter defined shadow banning

00:21:35
as no one but you can see what you do.

00:21:39
And other people clearly see shadow banning as, I'm not

00:21:42
getting as much reach as I thought I should.

00:21:44
But Twitter was very clear that they were fiddling with reach of

00:21:48
accounts that violated their policies.

00:21:51
So it feels like there's just like a total semantic war going

00:21:54
on here, right? There's a 2018 blog post where

00:21:57
the people in charge at Twitter said, what is shadow

00:22:00
banning? The best definition we found is deliberately making

00:22:02
someone's content undiscoverable to everyone except the person

00:22:05
who posted it, unbeknownst to the original poster, right?

00:22:07
So they are taking the most extreme version of what people

00:22:11
talk about as shadow banning, where somebody's effectively in a

00:22:14
complete jail where they think they're on the platform but

00:22:17
nobody sees it, right? They then say in that post, what

00:22:20
we do do is we take people out of search and we downrank stuff,

00:22:23
and so, yes, it all comes down to discoverability.

00:22:26
Let's get into the definition of that.

00:22:27
I mean, downrank, he means, if you're looking at

00:22:30
trending topics, it won't appear.

00:22:32
If you have the algorithmic feed that puts in accounts that you

00:22:35
don't normally follow, but it just wants to highlight

00:22:37
something, you know, trending and exciting that people are

00:22:41
talking about. They'll put that in there even

00:22:42
if you don't follow it. So that is like a kind of

00:22:44
algorithmic boosting that Twitter engages in, and

00:22:47
they basically push these people out of that, right?

00:22:49
They effectively say: the places where we are putting content to

00:22:53
you that you did not explicitly ask

00:22:55
for, we consider that our editorial responsibility, and we

00:22:57
will make editorial decisions for that.

00:22:59
And that includes both the algorithmic feed,

00:23:01
like you said, as well as trending topics and other kinds

00:23:03
of recommendations, like you should follow this account, because there's

00:23:05
three or four interfaces at Twitter that will recommend

00:23:08
content to you. I'm sort of conflicted.

00:23:11
There's sort of the COVID case study, where Twitter seems to be

00:23:15
going after somebody for saying, oh, COVID

00:23:22
lockdowns are going to hurt our children because of learning loss

00:23:22
or whatever, which, with the benefit of hindsight, to me does seem like

00:23:26
some, you know, censorship around this sort of prevailing ideology.

00:23:30
On the other hand, the downranking or whatever they did to Libs

00:23:34
of TikTok, I don't know. If I ran a

00:23:35
platform, I did all this hard work,

00:23:37
I founded, like, a tech company, I built it up, and I'm, like,

00:23:41
distributing it to the world. I don't know if I'd want to look

00:23:43
back at my work product and see that this account is, like,

00:23:47
harassing sort of the most marginalized people in our society

00:23:51
and just sort of making fun of them.

00:23:53
Even if it doesn't violate, sort of, I don't know, whatever red line rules

00:23:56
I can dream up. Why am I using this great tool

00:24:00
that I created to distribute that?

00:24:01
Like, what's my obligation? Which is why this kind of feature

00:24:04
exists: to allow that account to exist.

00:24:07
Like if you want it to be able to be there then you need to

00:24:11
have some kind of mechanism where you're not making things

00:24:13
worse. And the thing you have to

00:24:15
remember is that all of trust and safety is adversarial.

00:24:18
Any decision you make, the person who you're making

00:24:21
decisions about their content will adjust.

00:24:24
And what has happened is you have, effectively, these

00:24:27
kind of professional trolls,

00:24:30
like Libs of TikTok, who understand exactly where the

00:24:32
line is and they'll go up to a millimeter of that line and the

00:24:35
outcome is children's hospitals get bomb threats, right?

00:24:39
And so if you're Twitter, and the real-life impact

00:24:42
of this account is that people are getting death threats but

00:24:46
they are super careful not to violate: one, you might

00:24:48
just make it go away totally and say, like, we're just going to

00:24:51
take this risk or you might do something like this.

00:24:54
where it's like, okay, you can exist, but we're

00:24:56
not going to amplify it, you know, or whatever.

00:24:58
Right. Let's go.

00:24:58
The argument a number of people are making is that Libs of TikTok

00:25:00
probably actually got a good deal here, in that they violated

00:25:03
multiple times, right? And they got special

00:25:05
treatment, right? If something was adjudicated

00:25:08
against them, it needed to go to the highest

00:25:10
levels. They effectively got what was

00:25:12
then a scandal when Facebook had it, something called cross-check.

00:25:14
Which is, this is actually pretty common and

00:25:16
something you have to do at these companies.

00:25:18
When you have a very large account, you end up marking it

00:25:21
so that a normal day-to-day content moderation worker

00:25:24
can't just take it down. Like, you can't run a social network where

00:25:27
the President of the United States can have their account reset by any contractor.

00:25:30
Right right. Right.

00:25:31
But basically they protected the account so that

00:25:34
normal content operations people could not take action on it.

00:25:37
So certainly it looks like Twitter went out of their way to

00:25:40
allow Libs of TikTok to continue to

00:25:42
exist. Do you think there is any

00:25:44
positive outcome for the broader public in

00:25:46
showing the way internal deliberations happen at social

00:25:49
media platforms? Because, you know, I do

00:25:51
think there's something interesting about the fact that

00:25:54
these were big stories at the time or there was like a lot of

00:25:56
attention paid to the claims by conservatives

00:25:58
of shadow banning, and reporters did not manage to get these

00:26:01
documents, right?

00:26:02
Kind of the, you know, whatever, mainstream media reporters did

00:26:05
not get these documents and write the stories that you know,

00:26:08
are coming out now because of Elon.

00:26:09
I mean, do you think there's some positive aspect to at least

00:26:12
that happening? Yes.

00:26:14
I mean, I think, one, this is something I've actually been

00:26:16
saying for years. I have a talk in which I talk

00:26:18
about how these companies act in a quasi-governmental manner.

00:26:22
They just do. They are making decisions about people's

00:26:25
political speech. They have to, if they want to

00:26:29
run these kinds of platforms and have the platforms be usable,

00:26:31
and they do not want really bad outcomes, like people dying,

00:26:35
then they have to act in this quasi-governmental manner, but

00:26:37
they do so without democratic legitimacy or transparency.

00:26:40
And so for years, I've been calling for these companies to

00:26:42
be more transparent in their decisions.

00:26:44
One of the ways you could do that, maybe a positive outcome

00:26:46
and something I proposed to Musk before he, you

00:26:48
know, attacked me for it, was they could run a database: for the

00:26:52
last 30 days, these are all the content

00:26:54
moderation decisions we've made. And there are

00:26:57
interesting privacy issues, but you could provide that for

00:26:59
people so that you can see whether or not there's a bias

00:27:02
because he's making the assertion that these decisions

00:27:05
were politically biased but he's not providing any evidence of

00:27:07
that. All he's doing is

00:27:10
what the Trump people ask for, you know. It's just, like,

00:27:13
look at these case studies and we won't tell you about any of

00:27:16
the other case studies, let alone

00:27:18
actually provide you data that would allow you to sort of

00:27:21
analyze it. Right now, I guarantee you've got, like,

00:27:24
maybe, you know, BLM protesters who say things that could be

00:27:28
considered violent against police officers, getting the same

00:27:30
kind of limits. I totally expect

00:27:32
you have antifa and other kinds of, like, far-left

00:27:34
folks, anarchists. There have been tons

00:27:36
of complaints by pro-Palestinian protesters who

00:27:38
claim that they are constantly getting, you know, whatever,

00:27:41
deboosted, shadow banned. And I mean, the really

00:27:43
frustrating thing about the take that came after, you know,

00:27:47
Bari's tweets was, like, this is only happening on one side. And

00:27:50
it's like, well, you're going to say it's only happening on one

00:27:52
side when it's reported by someone whose goal as a reporter

00:27:55
right now, is to show that the left is censorious and that the

00:27:58
whose goal is to make sure that conservatives do not get their

00:28:01
news through mainstream channels.

00:28:02
You're not going to get an even-handed hearing from Bari

00:28:04
Weiss when it comes to, you know, true leftists, not

00:28:07
like center-left, but, like, actual antifa, or, I won't say

00:28:11
that word because that's loaded. But, you know, like, Palestinian

00:28:13
rights, any sort of, you know, Muslim

00:28:15
causes. And it can be asymmetrical.

00:28:18
Like these people just assume that there's necessarily

00:28:22
symmetrical bad behavior, and I just find that a

00:28:24
totally absurd claim. Which, if you had a database here, you

00:28:28
could get different groups analyzing that data and then

00:28:32
publishing their methodology and doing peer-reviewed work on whether

00:28:35
they're biased or not. And you're totally right, Tom.

00:28:37
Like there are other groups that have a lot of complaints here.

00:28:39
I think the pro-Palestinian groups have

00:28:42
pretty legitimate complaints and it goes exactly to the first

00:28:44
thread. The state of Israel runs what's

00:28:47
called an internet referral unit.

00:28:49
There are full-time employees of the state of Israel whose

00:28:52
entire job it is to tell Twitter and

00:28:54
Facebook to take content down under Israeli law.

00:28:56
Is that bad? Well, I think that is something

00:28:59
that is going to exist. And so what we should have is

00:29:02
complete and total transparency in what content was taken down

00:29:05
because a sovereign state said it should be taken down, right?

00:29:09
And that's the kind of transparency that if they want

00:29:11
to provide would be great. I'm not sure Bari Weiss

00:29:13
believes that, that would be appropriate, but if she really

00:29:15
cared about the things she thinks she cares about, then

00:29:18
complete transparency of what is being moderated and in what

00:29:21
situations that was because of an external request, I think, is

00:29:24
required. And this is what is lost.

00:29:25
And I think the bad-faith nature of these debates is because it is so

00:29:29
hard to differentiate, you know, a supposed absolutist stance on

00:29:33
free speech from the particular political

00:29:35
viewpoints that you hold that you feel are being, you know,

00:29:38
kneecapped by the people that are in charge, because if you

00:29:41
truly did believe in these things, you being, you know, the

00:29:44
free speech absolutists, Elon Musk, David Sacks,

00:29:46
all these people that are bitching and moaning on Twitter,

00:29:48
they would not have released it to specific journalists who were

00:29:51
given specific instructions on how to disseminate this

00:29:54
information. Jack Dorsey even publicly appealed to Elon: you

00:29:58
should release all of these documents to all the journalists

00:30:01
in order to provide full transparency.

00:30:03
Which you were saying at the outset would be the only way

00:30:05
that you could have, like, a true

00:30:07
and fair reckoning of what was going on inside Twitter during

00:30:09
this period, right? These guys are definitionally

00:30:12
useful idiots here. They targeted reporters

00:30:15
who want to see the world a certain way, who really want to

00:30:19
profess independence but are clearly aligned with delivering

00:30:23
the message that Elon Musk wants.

00:30:25
I mean, it's very similar to some of the Iraq War reporting.

00:30:29
And, you know, you would think Taibbi would be sort of terrified

00:30:32
of becoming that person. I just find it ridiculous.

00:30:36
And the other thing I wanted to flag, and I think we've

00:30:38
alluded to this, is just the sort of

00:30:41
American narcissism, like the great sort of political

00:30:46
speech challenge of our time is going to revolve around Libs of

00:30:49
TikTok rather than China, nation-states, fascism, like the

00:30:54
big questions. Just the narcissism of it is

00:30:58
mind-boggling to me. Well, okay, but if

00:31:01
you want to take their argument seriously, that has been true of

00:31:04
kind of the American center-left as well, in that all of the

00:31:07
discussions were about Trump and not about the fact that 95% of

00:31:11
Facebook's users are outside the United States, 80% of Twitter's

00:31:14
users are outside the United States. But those people are facing much

00:31:18
worse, because, one, they live in states that have massive

00:31:21
censorship and propaganda outlets, and, two, their countries

00:31:24
don't have a First Amendment. And so if you live in

00:31:26
India, the Hunter Biden laptop seems quaint, because right now

00:31:30
the Indian state is the most censorious democratic state on

00:31:33
the planet. They send more requests than any

00:31:36
other government to take down stuff on Twitter.

00:31:39
A lot of that, yeah, is on Twitter and Facebook.

00:31:40
How much is warranted? How much is Twitter in India? And then get to

00:31:44
China, like, what?

00:31:45
How much is Twitter dealing with India and China right now?

00:31:47
Like how important are those platforms in those countries?

00:31:50
Right. So Twitter is blocked in China,

00:31:53
right? When you think about the risk

00:31:55
from China, it is that the People's Republic of China has a

00:31:58
very large and growing propaganda capability

00:32:01
that targets Twitter. Traditionally, the Chinese

00:32:04
propaganda capability was focused on Chinese, not

00:32:06
non-Chinese, platforms, and two incidents have changed that over the

00:32:10
last five years. First, the series of

00:32:12
uprisings in Hong Kong, where the PRC found themselves

00:32:15
totally at a disadvantage versus these highly online,

00:32:19
very good English-speaking Hong Kong students who

00:32:21
were able to get their side of the story out.

00:32:24
And then the second was COVID: that, you know, with

00:32:26
everybody blaming China at some level. Whatever you believe

00:32:30
about Wuhan, I think you could say, you know,

00:32:32
obviously COVID came from China, whether it was natural or not,

00:32:35
and they wanted to distract from that.

00:32:38
And also push the idea that China's response to COVID-19...

00:32:58
About China, right? Has he said anything about it?

00:33:01
People are directly asking about it.

00:33:04
He has extreme sensitivity. Like, he has factories in China.

00:33:08
Right, there are obvious pressure points that he's sensitive to,

00:33:11
and he won't say a word about it. Almost

00:33:13
all of his net worth is in Tesla

00:33:15
stock. Tesla has a massive gigafactory in Shanghai, from

00:33:19
which they will be producing cars to be shipped around the

00:33:21
world, but especially in Asia, I believe.

00:33:24
And China is already 25 percent of their revenue.

00:33:27
Clearly, if you look at, like, discounted cash flows for the

00:33:29
future, China is going to be more than

00:33:31
50%, right, of their stock price. And so, yes, the PRC has huge

00:33:35
leverage over him. It's like if Mark Zuckerberg,

00:33:37
instead of having all his money in Facebook stock, his money was,

00:33:40
like in a Chinese pharmaceutical company, right?

00:33:42
And this is something

00:33:44
we've never really faced. If Republicans were sincere,

00:33:46
they would be going insane over this.

00:33:48
I mean, any national-security-

00:33:51
serious person would be terrified of this situation.

00:33:54
Yes. Yeah.

00:33:56
Right. Because, so, right

00:33:58
before the election, this got very little play, but we

00:34:00
wrote this up at eipartnership.net.

00:34:02
If you look at our blog for 2022, right

00:34:05
before the election, there were five disinformation networks

00:34:08
taken down in coordinated work by Twitter and Facebook.

00:34:12
We did the write-up and the analysis of who they were, and it

00:34:15
was the Iranians and the Chinese, and most of it was anti-Republican.

00:34:19
In fact, one of the groups, they created an entire fake

00:34:22
anti-Rubio group in Florida. That's something the Chinese did.

00:34:25
And why is that? Because they think Republicans

00:34:28
are harder on Iran or, sort of, China? Yes.

00:34:30
Right. And I think, actually,

00:34:32
you're right. The idea that foreign influence is

00:34:35
something that is only pro-Republican is a very 2016

00:34:39
idea. What's happened is American

00:34:42
democracy has become the World Cup of disinformation, where

00:34:46
everybody cares about our elections, even our House and

00:34:49
Senate elections, right, where they don't get to vote in them.

00:34:51
It impacts everyone. Right, right.

00:34:54
And so if you are a government, you've got your political leaders

00:34:57
saying, hey, why are we not playing here too, if the Russians

00:35:01
are playing there, and the Chinese?

00:35:02
And so, you know, Iran and China were running

00:35:06
disinformation campaigns. We're talking about Twitter: that

00:35:06
entire team at Twitter that did that work is gone.

00:35:09
Every single one of them, not a single person that we used to

00:35:11
email to work on this stuff together is still there. And

00:35:15
Republicans should care about that because a bunch of

00:35:17
disinformation on Twitter is anti-Republican, right?

00:35:20
It's anti-GOP, it's sometimes anti-Trump,

00:35:22
but in many cases targeting, in this case like Rubio, very

00:35:25
specific kind of mainstream Republican politicians.

00:35:28
They see them as not beneficial to Iran or China. And so,

00:35:31
yes, I do think like we're stuck in this weird 2017 moment where

00:35:37
Republicans don't care about foreign influence and Democrats

00:35:39
do, and therefore you can play to the right by saying that none of

00:35:43
this stuff is real but that just does not match the facts on the

00:35:45
ground as documented, and that we will not be able to document

00:35:48
very much anymore if nobody at Twitter is minding

00:35:51
the store. So this is kind of a soft target, but I feel like we

00:35:54
have to deal with it to an extent. We've seen, as Elon

00:35:58
has, you know, spent a little more time on Twitter,

00:36:01
there is a creeping arbitrariness behind his own

00:36:03
moderation decisions. He has banned Kanye from the

00:36:06
platform for posting a swastika, which is actually fine,

00:36:10
I would think. Like, you could release the process behind

00:36:12
that. I mean, sure.

00:36:14
Similarly, he has not allowed Alex Jones back

00:36:16
on the platform, because he believes that, you know,

00:36:19
everything that he's said about Sandy Hook was, you know,

00:36:21
disgusting and harmful. And he doesn't want to be

00:36:24
running a platform that's like that.

00:36:25
So there's obviously a huge amount of blatant hypocrisy

00:36:28
in the way that he is approaching content

00:36:30
moderation decisions. Where do you see this going with

00:36:33
him? Like, is he still in this kind

00:36:35
of 2017 mindset? Is he starting to come to terms with the fact

00:36:38
that a free-for-all, absolutist stance on free speech is not

00:36:41
actually what he wants? Let's even take the economic

00:36:43
pressures from what advertisers want off the table.

00:36:46
Where do you see Elon progressing

00:36:49
in terms of building up or rebuilding some of the content

00:36:52
moderation structures that he took down?

00:36:54
I really don't know, other than the thing I said the day he

00:36:58
closed the deal,

00:36:58
which was he kind of bought himself into a world of pain here,

00:37:01
because, one, he is exposed to all these countries.

00:37:04
He sells products all around the world.

00:37:06
He is somebody who has until this moment maintained a kind of

00:37:09
bipartisan respect. Another company

00:37:12
he owns, SpaceX, I believe 80 to 90 percent of the revenue comes from the US

00:37:16
government. They are effectively a defense

00:37:17
contractor to NASA and DoD, and so he's kind of ruining his

00:37:21
reputation with half the people who vote on

00:37:23
budgets that give him billions of dollars.

00:37:25
And then, by saying, I am the decider, there are no policy

00:37:29
mechanisms, there's no council, there's no

00:37:32
discussion, you only have to lobby me,

00:37:34
he has made himself personally, if not legally then

00:37:38
morally, responsible for everything that happens on Twitter.

00:37:40
And I think one, he seems to be kind of spinning out of control

00:37:43
a little bit in his own interactions.

00:37:45
Like, it's getting worse and worse, like what he just called

00:37:48
me. Yeah, he said I run a propaganda platform, right?

00:37:52
And so like I proposed to him, you know, that they should have

00:37:54
these transparency mechanisms, which is a totally nonpartisan,

00:37:58
you know, idea. And he said I run a propaganda

00:38:00
platform. Which, you know, we've got a couple of grad students,

00:38:03
I've got, like, four full-time employees and a bunch of real

00:38:05
smart kids who work for me. I'm not sure I'd call that a

00:38:07
platform, but that's cool, like, if you want to say that. But,

00:38:10
like, he's kind of spinning out of control in his interactions

00:38:12
with people, and I think he's going to find

00:38:15
this is not fun. It's also really affecting the

00:38:17
stock price. Tesla stock has plummeted even worse than all

00:38:21
these other companies. I think part of it is he is

00:38:24
destroying the brand of Tesla and he is going to find that

00:38:27
doing this for the lulz is gonna have real

00:38:29
long-term economic impact on him.

00:38:31
I think in a year he's not going to own Twitter because I think

00:38:34
it's not going to be fun dealing with these issues.

00:38:36
They never stop. There will never not be a

00:38:38
moderation controversy on Twitter.

00:38:41
It is going to massively distract him and it's destroying

00:38:43
his brand. And I say this as, like, I drive a Tesla,

00:38:46
I have a Tesla roof, I have Powerwalls behind me back here.

00:38:49
So I think Tesla has built incredible things, and I don't

00:38:53
think I'd ever buy a Tesla product again. I've got 20 years of

00:38:55
depreciation now on the roof that I'd have to live with, but

00:38:58
I'm never buying one of the cars again. And that's true for, like, you

00:39:00
know, when you think about the people who buy Teslas, you're

00:39:02
talking about kind of urban and suburban college-educated people

00:39:05
with high incomes. He's not playing to that base.

00:39:09
Right, right, right. And I mean, yes, I've heard this

00:39:11
argument from a lot of other people, that it's like, if

00:39:14
you are, and this is just the general issue I think the

00:39:16
Republican party is maybe running into, as you fight

00:39:19
entirely on culture war, it matters, but you are yourselves,

00:39:21
you know, a well-off group of people, you kind of got to think

00:39:25
about who your audience is at a certain point. And that's fine if

00:39:28
you feel like you're the voice for, you know, Middle America,

00:39:30
but you're also selling hella expensive cars.

00:39:33
I mean, good for you, if you can make cars for those people, but

00:39:35
by and large you really aren't. I know this is getting

00:39:38
psychological. But like, what do you think is

00:39:40
going on? Sometimes I'm like, it's their

00:39:42
children, their children are all, like, so left that they

00:39:44
can't even understand it, and the culture they're facing with

00:39:47
their children is alienating to them.

00:39:50
Why is he, like, huffing this sort of Libs of TikTok

00:39:54
content to such a degree that he has been

00:39:56
so radicalized? He totally vacillates between saying that

00:40:02
he wants to, like, be on the side of the middle 80%, and then he

00:40:07
says he's going to support Ron DeSantis, and every

00:40:10
account he's validating is some weird right-wing account.

00:40:14
Well that's where the business side gets really interesting too

00:40:16
because you can see the rationalization starting to come

00:40:19
together over there: that because they are taking

00:40:21
this quote-unquote absolutist free speech stance,

00:40:24
they're starting to lose advertisers, and now they're

00:40:26
blaming, you

00:40:27
know, wokeness for all of this, because, you know, these advertisers

00:40:31
don't want to be associated with a platform that is, you know,

00:40:33
riven with hate speech. But if you are an advertiser,

00:40:36
you know, or a big brand, you are trying to sell the most

00:40:39
amount of product as possible, and you see, like, the sentiment

00:40:42
that most of the country has, which is most people don't like

00:40:45
seeing these things on their timeline.

00:40:46
And most people don't want to be surrounded by, you know, this

00:40:49
level of hate speech. And so they can't even, like, rely

00:40:53
on the free market argument in order to prove that what they're

00:40:56
fighting for is, like, what, you know, most of the country

00:40:59
wants. Yeah, Twitter

00:41:02
is definitely cash flow negative at this point, right?

00:41:05
Like, it's never been a great business over the last 10

00:41:07
years, you know. It's only had a couple of

00:41:08
quarters of profitability, but effectively it was making as

00:41:11
much revenue as it spent, so it was going to be a going

00:41:14
concern for the foreseeable future.

00:41:16
And what he did was he did reduce his costs by laying off

00:41:19
all these people, but he also has massively destroyed his revenue.

00:41:22
I'm sure he's increased engagement with all this

00:41:24
craziness, but that is the supply of advertising,

00:41:28
right? So he's increased the supply in

00:41:29
the marketplace. Every time you see an ad on

00:41:32
Twitter or any other platform, that is the outcome of a

00:41:35
real-time bidding war between advertisers, and I guarantee the

00:41:39
price of those Twitter ads has gone through the floor, because

00:41:42
the big brand advertisers who are willing to spend the big

00:41:45
money on those CPCs or CPMs are

00:41:49
mostly gone. A couple are left, like Amazon.

00:41:51
And then you saw that with the Daily Stormer guy:

00:41:54
when you go through his feed, the Amazon ads are on there.

00:41:56
So we just saw, you know, a quiet letter to advertisers from

00:41:59
Twitter saying, we're coming up with more brand safety stuff so

00:42:02
you can block certain accounts and such.

00:42:05
I don't know if that's going to be good enough, but I am sure

00:42:07
they're losing money, and that doesn't even include the fact

00:42:10
that he borrowed a huge amount of money.

00:42:13
He attached to the new Twitter corporation over a billion

00:42:16
dollars in interest payments per year.

00:42:18
And so, yes, he could massively cut the staff and keep Twitter

00:42:21
running, because a lot of the hard engineering has been done

00:42:23
and it's now in, like, a sustained engineering mode. But he's not

00:42:26
going to be able to make serious

00:42:27
changes. He's talked about these big

00:42:29
ambitious things, building the everything app, building

00:42:31
payments, building this, building that. You

00:42:33
can't do that on a skeleton crew.

00:42:35
At least not safely, once you already have a base of

00:42:37
hundreds of millions of users. And so I think he's going

00:42:40
to find himself in this weird trap where he's going to have to

00:42:44
continue to subsidize it out of his personal net worth, which,

00:42:46
again, because of his actions, is dropping, right?

00:42:49
And so, like, the people who really should be angry here are

00:42:51
probably Tesla shareholders because his continued sale of

00:42:54
Tesla stock plus his erratic behavior is tanking Tesla stock

00:42:58
in a way that is great for the shorts, of which I'm not one.

00:43:03
And the shorts have made billions.

00:43:05
But the normal Tesla stockholders, they're the ones

00:43:07
holding the bag. Do you think liberals have set too

00:43:09
low of a bar for Elon on Twitter? Like, do you think this whole

00:43:13
thing, the website, is just going to start, like, going down? Or, it

00:43:17
felt like there was a moment where I don't know, there was

00:43:20
just like a panic over, like, oh, this could all end, like

00:43:23
tomorrow. How do you see that playing out?

00:43:26
I mean, he's just running more risk, right?

00:43:28
Like, you could run Twitter with 300 to 400 people.

00:43:30
So you know, Twitter has about 500 servers in three main

00:43:33
data centers, as well as in a couple dozen POPs, right?

00:43:36
So you need your data center operations people.

00:43:38
You need your infra people who manage the hardware and software

00:43:42
remotely and then you need your devops people, who keep it

00:43:45
running, that's your minimum right?

00:43:47
So you know, probably maybe 400 people,

00:43:49
and you can keep it up and running indefinitely. If you

00:43:52
want to actually build this incredibly complicated app that

00:43:55
is effectively the American version of WeChat,

00:43:57
then you're going to need hundreds or thousands of engineers,

00:44:00
designers, product managers, and the like. And so is it just going

00:44:04
to go down? I think he's taking more risk

00:44:06
because the depth of their engineering talent has

00:44:09
decreased. And I think one of the crazy

00:44:10
things he's done is he's done the layoffs in a way that has

00:44:14
incentivized people who have other options to leave the

00:44:17
fastest. Like, just take the politics out of it; from

00:44:21
a Harvard Business Review organizational management

00:44:23
perspective, sending an email that says click

00:44:27
this button if you're so hardcore you want to have a

00:44:29
horrible life. If you don't, I will

00:44:32
automatically pay you 90 days of severance, going into the

00:44:34
holidays. That is just selecting for people who don't think they can

00:44:38
get a job by January and people who have like H-1B visas and

00:44:42
such. And so you're left with a bunch of people who

00:44:44
can't leave, or people who believe in whatever, you know, political

00:44:47
project he is trying to advance through

00:44:49
Twitter, which I don't think is

00:44:51
a majority of the L7 engineers at Twitter.

00:44:56
I guarantee that the majority of them do not think he's doing a

00:44:59
good job. And those people, there have been a lot of

00:45:02
layoffs in tech, but if you're at like an L7 at Twitter and

00:45:04
you have operated at that scale, you will have a job in two weeks.

00:45:07
And his vision isn't even clear. Like, even if you're a die-hard,

00:45:11
he alternates. So it's harder; you're really

00:45:13
sort of going with the, I don't know, dear-leader sensibility,

00:45:17
where you're supporting, right, whatever Elon

00:45:19
capriciously wants to do. I mean, it's worked in the past

00:45:23
because Tesla had a mission: we're going to make electric

00:45:25
cars real, we're going to save the Earth.

00:45:27
SpaceX: we're getting humanity off this

00:45:29
planet, we're reinvigorating American spaceflight.

00:45:31
Those are incredible missions. They're missions people will

00:45:33
take less money and work 80-hour weeks for.

00:45:36
There's none of that. What is the elevator pitch of

00:45:38
Musk's Twitter? It's: we're going to run this for

00:45:41
the personal gratification of Musk.

00:45:43
That is not the kind of thing where you're like, oh, I'll never see

00:45:45
my kids and I'm fine with it.

00:45:48
That's not going to motivate you.

00:45:49
So this gets to the kind of interesting question, at least

00:45:52
interesting to us, and maybe a question about the

00:45:53
journalists that are at the center of all of these

00:45:56
quote-unquote revelations. Because, you know, we have, with

00:45:59
Bari Weiss... well, she's more or less just,

00:46:00
you know, become a partisan for certain types of people that

00:46:04
believe that the left is an anti-free-speech group, and they're,

00:46:07
you know, they are out there to cater to the whims of the

00:46:12
blue-haired pronoun people. And you have people like Matt

00:46:15
Taibbi, who I think is a great journalist.

00:46:17
I'd love to have him on. Matt, if you hear this, we're down to

00:46:21
chat, but I think there is something very strange and

00:46:23
almost insidious happening with the non-mainstream journalists,

00:46:27
the ones that are contrarian basically by nature and that

00:46:30
they are trying to rebel against, you know, the

00:46:33
mainstream argument that's been pushed forward over the last

00:46:36
couple of years, and the overreaction, I would say

00:46:39
overreaction, by the New York Times to things like Russiagate or,

00:46:42
you know, Facebook content moderation, all this other stuff,

00:46:45
and they have ended up in a place like Eric mentioned

00:46:47
earlier, in which they are

00:46:49
willing to be mouthpieces, and there's really no other word for it,

00:46:52
mouthpieces for the richest person in the world in which he

00:46:54
can, you know, selectively release documents

00:46:57
that are being pushed through the platform that he owns, which,

00:47:00
if nothing else, is going to bring attention, like it did, to this

00:47:02
fucking thing. But from a journalistic aspect

00:47:04
here, how did we end up here? Is there anything positive to

00:47:07
say about deciding to carry water for Elon Musk in an

00:47:12
extremely untransparent way? My personal view on this is I'm

00:47:16
just very skeptical of, like, the conservative media project.

00:47:21
My first job out of college, I worked for the Washington

00:47:24
Examiner, which was sort of a Republican billionaire-funded

00:47:29
outlet. But I, you know, I took the job,

00:47:31
and I was covering D.C. City Hall, and you know, it was a fairly

00:47:36
straight news sensibility. But ultimately, I found the

00:47:41
conservative impulse to be, like, pretty sloppy: willing

00:47:44
to, like, throw out, you know, headline numbers of, like, budget

00:47:48
deficits without sort of much thinking

00:47:50
about the context and it was very similar to like what we're

00:47:53
seeing now, where it's: can it be truthful

00:47:55
enough that it's something that we can all get, like, angry about

00:47:59
and score points on. And I don't quite know, culturally, why

00:48:03
the right hasn't been able to build up great conservative

00:48:07
media, like, dispositionally, but I just, I've never seen them

00:48:11
successful, and that's part of why I sort of have cheered for,

00:48:14
like, a Musk Twitter. It's like, okay, make the right

00:48:17
actually govern as they say they want to. But I mean, I think,

00:48:21
as we've all sort of said, I've seen no evidence that they

00:48:25
can. I don't know. Alex, are you as cynical as I am?

00:48:28
Why can't the right wing deliver sort of a coherent

00:48:32
argument here? Well, it's interesting when you

00:48:35
talk about Taibbi. I think unfortunately he's trending

00:48:38
towards Greenwald, where the biggest enemy for Glenn

00:48:41
Greenwald is West Wing... Normie Democrats, right?

00:48:44
Right, like West Wing-watching Obama voters, even

00:48:48
though he has moved to a country that has a

00:48:50
semi-fascist dictator. Bolsonaro, who

00:48:54
Glenn Greenwald hates, is now fighting against

00:48:58
being democratically elected out, and you know who's going to

00:49:01
support him in that? It looks like Musk. Musk has been tweeting

00:49:03
positive things, saying he's looking into whether Bolsonaro

00:49:07
was improperly treated by Twitter. No matter what the

00:49:11
facts are, the worst people are just Normie

00:49:15
Democrats. I feel the same way about Ed

00:49:17
Snowden. It's like, one day, when Ed Snowden is no

00:49:21
longer useful, something horrible is going to happen to

00:49:24
him in Moscow by the FSB, and his last thought is going to be: the

00:49:29
real enemy is Barack Obama, right?

00:49:31
Like, you have these people where

00:49:34
they've got such anger that they have to then realign everything

00:49:38
else around these, like, basic ideas that, you know, basic

00:49:42
center-left, kind of normal government in the United States

00:49:46
is the worst thing ever. So somehow either Trump is

00:49:49
President or, you know, populists around the world are

00:49:53
really the representatives of the people.

00:49:55
I do sadly think that the common

00:50:00
denominator in a lot of this, ironically, is Twitter: that the

00:50:04
Twitter engagement mechanism, for a lot of people, especially

00:50:07
contrarians, is just to get into fights with people.

00:50:09
It's to score points in the midst of some sort of Twitter

00:50:12
argument. And that by definition is going

00:50:12
to push you into taking very bizarre stances and into

00:50:16
strange corners. And if I look at someone like Greenwald

00:50:21
journalistically, the stuff that he's pushing out

00:50:24
through his Substack, and, you know, wherever else he's still publishing

00:50:26
articles, is still pretty strong stuff.

00:50:26
I mean, you mentioned Brazil, you know, he was responsible for

00:50:29
the dissemination of a lot of files that were leaked in

00:50:32
Brazil, that helped exonerate Lula, who is now

00:50:35
maybe going to be the president there.

00:50:36
So again, when it gets to the moral compass, I still think

00:50:40
it's pointing in the right direction. But because Twitter

00:50:42
has dominated so much of the way journalists view their job

00:50:47
and the way to establish their brand,

00:50:49
it's just going to end up in this place

00:50:50
where really good journalists are willing to make

00:50:53
concessions to some of the least good-faith actors out

00:50:57
there, ones that in every other way

00:50:58
they do not agree with politically, just so they can,

00:51:01
you know, score points against Normie Democrats.

00:51:03
And it's super depressing. There's no other word for it.

00:51:06
Because I do respect these people, but this is not the answer, I

00:51:09
think. Yeah, the interesting thing we

00:51:11
talked about yesterday too is this strategy of bringing these

00:51:14
people in to look through your files.

00:51:16
I don't know whether it's

00:51:19
journalistically appropriate to only do it for two, but if

00:51:21
you're going to let people look through your internal

00:51:22
communications, Musk has a legal responsibility to protect

00:51:26
user data. So if we continue down this path

00:51:29
where now Taibbi and Weiss, those

00:51:34
kinds of political actors, are able to go look through user

00:51:37
data, then Musk is in a world of hurt. And just

00:51:39
While we're recording this, it was announced that something I

00:51:41
predicted yesterday has happened, which is the Irish Data Protection

00:51:44
Commission is now looking into what kind of access Bari Weiss

00:51:46
had, because having access to these internal interfaces is

00:51:50
exactly the thing that got Twitter in trouble over and

00:51:53
over again, because outside hackers or people who were

00:51:56
working for the Kingdom of Saudi Arabia had access to user data.

00:52:00
And so Twitter has agreements with the FTC to not allow random

00:52:04
people to get access to Twitter user data.

00:52:07
And they have such agreements with the Irish DPC, which is the

00:52:10
most important regulator for them. And what sort of data

00:52:12
specifically is the most sensitive? I mean, DMs obviously, but, like,

00:52:16
outside of that, what do you think is of major concern to

00:52:18
regulators? So in the U.S.,

00:52:20
DMs for sure, because that is the only data for which there's, like,

00:52:24
a straight black-letter law that Twitter is responsible for:

00:52:27
something called the Stored Communications Act, which is a

00:52:29
pretty old law, signed by Ronald Reagan. It really was

00:52:33
meant to apply to the phone companies and very early email

00:52:36
systems. But the Stored Communications Act, 18 USC 2702,

00:52:40
specifically says that if you're a holder of

00:52:44
people's stored communications, you cannot release them to anybody

00:52:47
except under certain circumstances. And "we're having

00:52:51
a lot of fun for the lols" is, it turns out, not in the law

00:52:54
Ronald Reagan signed. You can't release

00:52:57
SCA-covered material for trolling. For the FTC, it's going to be a much

00:53:01
broader set: IP addresses, phone numbers.

00:53:04
Anything that is non-public information is covered by the

00:53:06
FTC consent decree, which already covers some of the stuff that's

00:53:09
in the interfaces that Weiss looked at.

00:53:12
It's not clear whether Weiss was actually hands

00:53:14
on keyboard or whether she was just looking over a shoulder.

00:53:17
But in either case, that was probably a violation of

00:53:20
the FTC consent decree. This can put journalists on a

00:53:23
weird side here, where we're going to be rooting, or at least

00:53:26
some of us are going to be rooting, for reporters,

00:53:29
quote-unquote, to get in trouble for violating Twitter's

00:53:32
data security. So, there's a difference between

00:53:35
Elite email between employees or a document inside and user data,

00:53:39
right? And for the most part, when you

00:53:40
talk about like the Facebook files and such of situations

00:53:43
where journalists have cheered leaks, those are generally

00:53:47
internal correspondence between Tween employees.

00:53:50
That at least, where I saw were generally, those journalists

00:53:53
were then, very careful to take out any pii.

00:53:57
But in this case, I think the fear is that the next

00:54:01
step for Musk on this path is he's just going to let these

00:54:04
people get access to DMs, and if he does

00:54:06
so, he is just straight violating black-letter federal law.

00:54:09
Lunacy. I mean, yeah, I mean, I love the stuff that's happening

00:54:13
today. We would say was a month, right?

00:54:15
Well, I mean, you know, the legal parameters around what Twitter

00:54:19
should be doing are already getting brittle, because we saw Elon very

00:54:24
publicly fire a guy named James Baker,

00:54:26
the company's, was it associate general counsel, or,

00:54:29
I mean, the high-up attorney, sorry,

00:54:31
deputy general counsel, who I believe was vetting these

00:54:33
documents around the time that, you know, Bari Weiss was

00:54:36
looking at them. And this was depicted as some

00:54:38
sort of nefarious act, where I imagine his argument,

00:54:42
this is James Baker's argument, is like: we're about to dump a

00:54:44
whole bunch of internal shit here.

00:54:46
I think I'd want to know whether or not this is going to be in

00:54:48
violation of federal statutory lines.

00:54:50
And it's like, "I didn't know he worked for me," like, you know,

00:54:52
like that sort of builds his credibility?

00:54:56
I mean, it was one of his top lawyers, if not his

00:54:59
top lawyer, and Elon doesn't know who they might have

00:55:02
been? What... I mean, there's no general counsel.

00:55:04
So apparently his personal lawyer, who I am

00:55:07
shocked Quinn Emanuel still has as an employee, but I guess

00:55:10
he's created a huge amount of litigation business for QE,

00:55:13
apparently his personal lawyer is effectively the GC now, and I

00:55:17
don't know how the GC would not know who their deputy

00:55:19
general counsels were. What I'd generally expect, considering Baker's

00:55:22
background... I don't know him.

00:55:23
I don't know exactly what he did, but every tech company's

00:55:26
got somebody like that who came from either, like, the National

00:55:28
Security Division of DOJ or the FBI, because when you operate at that

00:55:33
scale, you get legal requests from governments around the

00:55:36
world continuously. And so you need an entire legal

00:55:38
department that thinks about those things.

00:55:40
And so, yes, I guarantee one of the things in Baker's

00:55:42
internal analysis would be: we have SCA coverage, as well as

00:55:47
FTC and Irish Data Protection Commission

00:55:50
commitments, that we need to follow when doing this. And it's like,

00:55:53
oh, that's a bunch of legal gobbledygook, you're fired,

00:55:55
right? So I mean, you can YOLO through

00:55:57
this, but the truth is, when he bought Twitter, Twitter

00:56:00
had already twice made a deal with the FTC. So he

00:56:05
purchased a company that is already kind of under a consent

00:56:09
decree that has been looked over by a federal judge. His ability

00:56:12
to fight any of this was effectively already given away

00:56:15
by previous administrations of Twitter; that's part of the

00:56:18
liability he purchased. So if he thinks he

00:56:21
can just be like, oh, it's a reset because I bought it,

00:56:24
that's not how any of this works.

00:56:25
And more importantly than the Baker thing: on the same day,

00:56:28
his chief information security officer, chief compliance

00:56:31
officer, and chief privacy officer all resigned, and that

00:56:34
happened to be the day that they were supposed to sign a letter

00:56:37
to the FTC. So I think it is highly likely.

00:56:40
There's already a quiet FTC investigation going on, because

00:56:43
Twitter already missed a deadline under their consent

00:56:45
decree, and all of the people that the FTC

00:56:48
used to work with to make sure that Twitter was protecting user

00:56:51
data are gone. And if you violate the terms of your consent

00:56:55
decree, I mean, what are the penalties for that?

00:56:57
Is it a fine? Is it, you know, you lose

00:57:00
your tweeting license? Like, what exactly happens?

00:57:03
I mean, I believe the only thing I've ever heard of is the FTC can

00:57:05
fine you, but the amount they can do

00:57:07
so could be pretty significant. The largest right now is 5

00:57:10
billion, to Facebook. To Facebook, that was nothing; five billion

00:57:13
dollars to Twitter, which is now losing two to three billion

00:57:15
dollars a year, probably would be disastrous.

00:57:17
That would be more

00:57:19
than that. I mean, five billion dollars

00:57:20
would be more than all the cash equivalents,

00:57:22
all the liquid wealth of Twitter.

00:57:24
So if, like, the FTC matched the Facebook fine, Musk would have to

00:57:27
go sell Tesla stock and then recapitalize the

00:57:30
company to keep it a going concern. Are fines based on the

00:57:33
value of the company, or based purely on sort of how

00:57:37
egregious the infraction was? It is based upon the politics

00:57:41
and the negotiation and the judge. Where it is

00:57:45
based on the size of the company is in Europe,

00:57:47
and there are limits to how much they can

00:57:49
fine. But also, they could be ordered

00:57:50
to cease operations in Europe. That would be a possibility. To

00:57:53
get super reflective on this: I mean, the premise of the

00:57:56
conversation going in is like, okay, we want to take them as

00:58:01
seriously as possible. I mean, it's amazingly hard to

00:58:04
actually engage with, like, the Elon Musk camp directly.

00:58:08
I mean, Elon's not out there giving tough interviews, and even, like,

00:58:11
getting proxies for him is very difficult.

00:58:15
You know, we had Jason Calacanis on; he wouldn't talk about it.

00:58:19
So anyway, we started this conversation off with the

00:58:21
premise that you know, we wanted to sort of take them seriously.

00:58:25
I wanted to just raise like is that a mistake?

00:58:27
Like, do you think there's a point where we just need to sort

00:58:30
of stop taking them so seriously, if these arguments

00:58:34
are so sloppy? Yeah, I think you have to take Musk seriously,

00:58:38
because he has incredible power now. Like, in his ability to shape

00:58:41
the conversation in the middle of the American political class,

00:58:44
he is now the most important person.

00:58:45
Well, you don't have to take him as sincere.

00:58:47
In fact, he gives a lot of evidence

00:58:49
he's not; he contradicts himself every other day.

00:58:52
And yet you see reporters sort of constantly taking him at his

00:58:56
word. I don't know.

00:58:57
At some level we have to say somebody's not trustworthy,

00:59:01
they're not straightforward. I think you're right: Musk, like,

00:59:03
it's hard to take him completely seriously

00:59:05
now, even though he's got this real power. It's more interesting

00:59:08
to see, as you mentioned, Calacanis not being willing to engage on this.

00:59:12
I'm putting down a marker now: the Musk bubble in tech is

00:59:15
going to pop, right? Right now, it has become

00:59:18
trendy, seen as counterculture, to be on team Musk, and there's

00:59:21
a bunch of people, people who I used to work with, people who

00:59:24
I've interacted with socially, who are smart, serious people,

00:59:28
who are now kind of waving the Musk flag. And just like with

00:59:32
Trump, just like with FTX, that bubble is going to pop, and

00:59:35
you're going to see all these people

00:59:37
all of a sudden try to rewrite history: that they were just, oh,

00:59:40
you know, yeah, I invested a little bit of

00:59:41
money, or I gave him some suggestions. Because Musk is

00:59:44
accelerating his kind of breakdown here, and if you end up with

00:59:48
Twitter going out of business, him having to give up

00:59:51
Twitter to the bondholders or the debt holders,

00:59:55
if you see him having to step down as CEO of Tesla, if you

00:59:58
see some kind of massive moment, or, you know, if there's like a

01:00:02
horrible violent act that happens publicly,

01:00:04
if there is, God forbid, something on the level of a

01:00:07
Christchurch shooting, or something that gets attached

01:00:09
back to Musk's moderation decisions,

01:00:12
all of a sudden, all these people who thought it was really cool

01:00:15
to be on team Musk are going to reverse themselves. And so I hope

01:00:18
people are taking screenshots. Because you're right, there's just

01:00:21
a very scary, like, there's just a scary impulse

01:00:24
in the valley right now. Like, a lot of what the New York

01:00:26
Times wrote about people in Silicon Valley in 2018 was not

01:00:29
that correct. But now a bunch of things they

01:00:31
said then are now applying to 2022, right?

01:00:33
About the politics of individuals, and the fact that

01:00:36
he has become this Pied Piper for otherwise serious people

01:00:40
whose houses I've been to, I've had dinner with their

01:00:42
families, and now they've turned... It feels in Silicon

01:00:45
Valley a little bit like, you know, after Trump was

01:00:48
elected, the families got kind of riven,

01:00:50
right? It feels a little bit like that

01:00:51
in the valley, in that a bunch of otherwise serious,

01:00:54
smart people are now in this kind of orbit and are going to

01:00:58
have to... It will be interesting to see. Calacanis being

01:01:02
that quiet about it, I thought, was the start.

01:01:04
He's very smart about his public image, and I think that is

01:01:07
the start of an indication that people are going to back off.

01:01:10
I took him as being quiet because anything he says gets

01:01:13
attributed to Elon, and then Elon gets mad at him.

01:01:15
We saw it in the DMs, the texts.

01:01:19
Yeah, sorry, the text messages where

01:01:21
Calacanis is getting chided for being too loud.

01:01:23
So I took it that way. Honestly, as much as we enjoy Jason, I mean,

01:01:28
I feel like he's pretty much rolling over.

01:01:30
I mean, if you listen on All-In, he's not sort of defending sort

01:01:34
of the Democratic line; he's pretty

01:01:38
pro-Elon. Yeah, I think he's very

01:01:40
much sort of capitulated to his co-hosts, who love sort of the

01:01:44
Musk party and love to shit on Democrats right now.

01:01:48
I mean, by the way, if I were an editor: the idea that, you know,

01:01:52
Elon is the new Trump when it comes to dividing families in

01:01:54
Silicon Valley, and the uncomfortable conversations at

01:01:56
the table. Excellent story.

01:01:59
Yeah, that's smart. That's very smart.

01:02:02
If the New York Times wasn't on strike,

01:02:03
they could write that. It's the perfect New York Times Style

01:02:06
section story. They always want to be

01:02:08
optimists. I mean, that's, they feel like, I

01:02:10
mean, it's been a winner to be

01:02:12
pro-Elon, you know. I mean, Marc Andreessen was somewhat

01:02:15
defensive of Elizabeth Holmes for a period. Like,

01:02:19
the positioning over and over again has just been: defend

01:02:22
the sort of crazy optimists. Well, Andreessen's one of

01:02:26
those guys I'm very, very sad about, because I used to

01:02:28
work for him at Loudcloud. I reported to him on the board.

01:02:32
I'm an LP in Andreessen Horowitz.

01:02:33
I get a K-1 every year from him, and he blocked me on Twitter and

01:02:37
then was subtweeting me last night, and has gone

01:02:40
kind of full... What's the deal?

01:02:41
I mean, what's the difference with Marc Andreessen?

01:02:44
What's the deal? I mean, he's never been the most

01:02:47
kind of empathetic guy. Like, one of my first Marc Andreessen

01:02:51
moments was after I joined Loudcloud.

01:02:53
It was during the dot-bomb, and we were doing a layoff, and Ben

01:02:56
Horowitz, who actually is who he says he is, was up there

01:03:01
in tears talking to the company about how he had to lay people

01:03:04
off, and how horrible it was to do this

01:03:06
to the family, but it was necessary to move forward.

01:03:09
And Marc was over on the side on his BlackBerry typing out an

01:03:12
email, like, not even paying attention, right?

01:03:14
And I was like, that's the difference between Ben and Marc.

01:03:17
And so, yeah. I mean, I think

01:03:19
one of the things is that money has disconnected him from the

01:03:22
world. Like, he lives in, like, a palace

01:03:24
in Atherton with high walls. And I don't know if you saw, but,

01:03:27
you know, he was part of the group trying to keep out... yeah.

01:03:30
The whole episode about that. Yeah.

01:03:32
We were very interested in housing.

01:03:33
Yeah. Three million dollars per condo,

01:03:35
right? Like, cheap housing. They'll have to hire

01:03:39
round-the-clock security now, because there are, you know,

01:03:41
low-rent people moving into Atherton.

01:03:44
It's pretty dangerous. Hell yeah.

01:03:45
And so, I think, you know, there's these people in

01:03:47
the valley, to me, where the moment at which you start to completely

01:03:50
lose touch is when you have enough money to fly private.

01:03:53
That's like the last situation in which you ever have

01:03:56
to mix with normal people, an airport. And so if you're,

01:03:59
like, going in an armored SUV to the airport and you're flying

01:04:03
private and you're going to Davos, you never have to kind

01:04:06
of interact with normal people. There's a bunch of people in the

01:04:08
valley who are at that level and they're making that their

01:04:11
ideology. You know, Balaji is out there.

01:04:13
They don't give interviews, don't reply to comments; you know,

01:04:15
they sort of convince themselves that non-engagement

01:04:19
is sort of principled. And so then they make their

01:04:22
sort of self-isolation complete. I don't know, I just wish they'd

01:04:27
communicate in essay format. Like, I feel like, for people who

01:04:30
claim to be so smart, that all their arguments emerge in, like,

01:04:34
tweets is just... There's also, like, an inability to, like, accept the

01:04:38
L. I mean, you saw that after the

01:04:40
midterms. You know, the people that were the loudest in terms

01:04:43
of, you know, "woke people are destroying America": that campaign

01:04:46
tactic didn't work nearly as well as you would have expected

01:04:48
it to. And instead of saying, hey,

01:04:50
maybe we should recalibrate because our viewpoints aren't as

01:04:52
popular as we thought they were, you know, they just avoid the

01:04:55
topic, right? I know, the midterms felt

01:04:57
like a moment where, yeah, in Silicon Valley there was sort of the

01:05:00
sense that the based crowd or, you know, these meme warriors

01:05:05
were going to, like, win. It just seemed totally out of

01:05:07
touch with an America that clearly was very worried about

01:05:11
Normie lib stuff. Yeah, to zoom back to the

01:05:14
midterms. I think the positive thing here

01:05:15
is that, finally, a significant part of the professional Republican

01:05:19
Party is starting to understand that telling your voters that

01:05:24
voting is rigged, that voting early is a bad idea, that they

01:05:27
should not... You know, Kari Lake was telling

01:05:29
people not to put their ballot in the backup box if the local

01:05:33
scanner broke. That kind of stuff is going to become dispositive in

01:05:36
these elections where it's 15 or 20, right?

01:05:39
And so I think the positive thing that came

01:05:42
out of the midterms was that finally people have figured out

01:05:46
election denialism, in a democracy,

01:05:48
is a losing... you can fight it, but in the long run it is a

01:05:51
losing battle, because you're telling people to be politically

01:05:54
disconnected. And also, you know, just to bring this back to the

01:05:57
Twitter Files, you know, I think if these stories, you

01:06:00
know, or the revelations, do not end up getting the purchase and

01:06:03
reach that they were hoping for, because they, they

01:06:06
being Elon Musk and that whole crowd, decided to completely

01:06:09
circumvent, you know, the mainstream media's role in, you

01:06:12
know, the national discussion, then maybe you should think, you

01:06:15
know, they're not exactly always going to be the enemy, and

01:06:19
you have to at least position this in a somewhat neutral way

01:06:22
so that, you know, the broader public, at least the broader

01:06:25
public that reads the mainstream media, will want to

01:06:27
engage in this stuff. So, you know, if you guys didn't

01:06:29
end up getting what you wanted, through all of this, maybe some

01:06:32
self-reflection would be in order.

01:06:33
Are you surprised that Marc Andreessen is still on the Facebook board?

01:06:36
I am, but, you know, Peter Thiel made it, too.

01:06:41
You think it's just: being a loyalist is great, and why would we

01:06:44
get rid of somebody else? Like, even though, like,

01:06:47
clearly Andreessen disagrees with Zuckerberg on how he's

01:06:50
governing, at least he's, like, super deferential, right?

01:06:54
I think there have been situations in which the board

01:06:59
could have put some controls on Zuck.

01:07:01
There have been multiple situations in which Zuck asked the board

01:07:04
to kind of stamp his corporate control:

01:07:06
the creation of ultra-voting shares, things like that, after such an

01:07:10
incident. Yes.

01:07:12
Yeah. So whatever they disagree

01:07:14
with politically, you're right: in every situation where Zuck

01:07:18
needed his vote,

01:07:18
he got his vote.

01:07:20
So yeah, I think at this point Zuck isn't getting rid of

01:07:24
him unless he goes, like Thiel,

01:07:26
so politically outside. But what's happened is this:

01:07:29
Unfortunately, the board is much more subservient to Zuck than it

01:07:32
was during my day. Like, there were really

01:07:34
independent directors, Erskine Bowles, you know, for example,

01:07:39
who asked lots of questions and such, and I feel like the

01:07:42
Facebook board has become a rubber stamp. Which is, you know...

01:07:45
I mean, the upside for Zuck is he looks incredible

01:07:51
now, because, like, the bar has been set so low.

01:07:54
Like, the fact that he is not, you know, personally

01:07:57
trolling people, that he has structural mechanisms, an

01:08:02
oversight board... Zuck has an Oversight Board

01:08:05
and other internal structures for making decisions, that he

01:08:08
isn't out there personally going back and forth, back and forth

01:08:10
on things, makes him look really good. And, you know,

01:08:13
people thought the Oversight Board was a crazy idea.

01:08:15
Zuck hated being the guy that was personally responsible for content

01:08:18
moderation so much that he spent two hundred million

01:08:20
dollars to build this oversight board, and then Musk spent

01:08:24
forty-four billion dollars to become the guy Zuck tried to get out of being,

01:08:27
right? Very well, here's my last

01:08:29
question and we can, you know, try to spin it forward or at

01:08:32
least as broadly as possible. Is there anything that can be

01:08:36
learned about content moderation by the other

01:08:38
platforms from the debacle that has been playing out here, in

01:08:42
terms of Elon's approach to moderation? Is there something

01:08:45
that can be said at least reaffirming those

01:08:46
who felt that having a content

01:08:49
oversight board is meaningful? You know, the dialogue between

01:08:52
platforms and countries which, you know, I think that's a

01:08:54
fascinating thing that not enough people have written

01:08:56
about, I don't know, just try to give me some sort of optimism

01:08:59
about a positive outcome from, you know, the implosion that

01:09:03
we're witnessing here. I think the lesson has been that

01:09:08
having procedural mechanisms around content moderation, where

01:09:12
you set a standard and then you have a bunch of smart people

01:09:15
argue about whether certain content violates that standard

01:09:17
or not: while that seems like the

01:09:20
worst way to do content moderation, it's the worst way

01:09:23
except for all the others, right? It has demonstrated that there

01:09:25
is real value in not creating a situation where one

01:09:29
person is responsible and can be lobbied. Traditionally, Twitter smeared

01:09:33
this responsibility over dozens of folks, none of whom it could

01:09:36
be said of, "this person is responsible for

01:09:38
what happens." Because I don't think Musk understands the

01:09:41
people he's dealing with. There's

01:09:43
going to be violence, right? There is going to be violence

01:09:46
tied to the speech that is happening on

01:09:48
Twitter. That is a constant challenge of

01:09:50
balancing real-world violent impact versus trying to protect

01:09:54
political speech. And he's gone very hard to one

01:09:56
side, and that violence will be personally, if not legally,

01:09:59
attached to him. Morally attached, and it might be legally too. There's

01:10:02
the Gonzalez case in the Supreme Court, where effectively

01:10:05
Section 230 protections around certain kinds of violent acts

01:10:09
that are encouraged by platforms would be stripped, and responsibility

01:10:12
would be attached to the platforms.

01:10:14
And so, like, right now he's got some Section 230 protections,

01:10:17
but it might not make it past the Supreme Court, especially

01:10:20
when Republicans are in favor of repealing Section 230

01:10:22
as well, as the nuclear option. Do you think, do you

01:10:27
think we will see any legislation?

01:10:29
I mean, we could solve this transparency

01:10:32
problem with legislation, instead of just, you know, selectively

01:10:35
leaking? Yes.

01:10:36
Yeah. Do you think there'll be

01:10:37
legislation? Right. So, I mean, my colleague Nate

01:10:39
Persily has put out a platform transparency act that

01:10:42
has gotten bipartisan co-sponsors, so maybe

01:10:46
this is an opportunity. I'd love to see him

01:10:48
support required transparency. He could

01:10:51
voluntarily do it, but then it gets backstopped by law so that

01:10:54
his competitors have to have transparency too. For the most part,

01:10:56
I don't think there'll be regulation in the US.

01:10:58
What will be most important for Twitter is going

01:11:00
to be the FTC now, right? Like, they're pretty clearly in

01:11:03
violation of their consent decree; whether they can cure

01:11:05
that or end up getting fined will be an interesting question.

01:11:08
And then in Europe, Twitter is now the number one target of the

01:11:11
Digital Services Act. The Europeans are so happy,

01:11:14
because what they wanted to happen failed to happen with

01:11:17
GDPR: they didn't beat somebody up on the first day of

01:11:19
prison, right? So, like, GDPR was this real slow

01:11:22
burn where they wanted to punish a Facebook

01:11:26
or Google, but for a bunch of procedural reasons, they weren't

01:11:29
able to do so quickly. But with the DSA, I think, now that he is acting

01:11:33
so kind of out of the norm, and he's also fired all the people

01:11:37
that would normally fight the DSA for him.

01:11:39
I know everybody wants to make this about the Democratic Party,

01:11:42
but then we've got the Europeans over there, who are way to the

01:11:45
left of the Democrats, right? Right.

01:11:48
Well, as Elon would say: popcorn emoji, popcorn emoji.

01:11:51
Anyway, Alex, thank you so much for coming on.

01:11:54
Yeah, thanks, this is great. Thanks for doing this, Alex.

01:11:56
Thanks, guys. All right? Goodbye, goodbye.

01:12:11
Goodbye, goodbye. Goodbye.