
Can Big Tech Make Sure That 2020 Is Not 2016?

Facebook’s former chief security officer on what Big Tech needs to do for a free(r) and fair(er) election.

[theme music]

- When you walk in the room do you have Sway?

kara swisher

I love disaster movies, and I love action movies. I especially love movies where someone knows before anyone else that something bad is about to strike humanity.

clip from "tomb raider"

It could kill millions of innocent people.

clip from "tomb raider"

Now you’re being dramatic.

clip from "tomb raider"

Put it back.

clip from "tomb raider"

No!

kara swisher

But no one ever listens to those people. And I’ve always thought of Alex Stamos in this way.

[music]

Over the past decade, Stamos has held the top security jobs at two of the world’s most powerful tech companies. In both cases, Stamos tried to steer his bosses away from bad decisions — first at Yahoo, where the company was asked by the U.S. government to spy on its email customers, and later at Facebook, when that company was slow to disclose a Russian campaign to interfere in the 2016 elections. And in both cases, after a lot of wrangling with the top brass, Stamos ended up leaving.

Now, he’s on his own, advising companies like Zoom on cybersecurity and presiding over the newly founded Internet Observatory at the Stanford Cyber Policy Center. With disinformation, hate speech, and election insecurity looming, Stamos’ warnings are as urgent as the mythical Cassandra’s. Will his go ignored, too, or will we listen now? This is part one of “Sway’s” two-parter about the people trying to stave off a 2020 election apocalypse.

Alex, welcome to “Sway,” thank you for coming on.

alex stamos

Hey, thanks for having me here.

kara swisher

So here’s why I had you on: I know you really like me, because you actually quote me in your Twitter bio. It’s from when I described you as an irritant. Explain why that is. I like irritants, and I like people who disagree with me, although people would imagine I might not. So why don’t you explain to me why you irritate me so much?

alex stamos

[CHUCKLES] Why I irritate you. Well, I think you’re partially describing that I have been irritating to a number of companies that I’ve worked at, where I’ve been the chief security officer — at Yahoo and Facebook. And unfortunately, that’s a job where you are the highest-up executive in a company who doesn’t believe in the company’s mission and thinks everything’s terrible, right? That is effectively the job of the chief security officer — to look only for downsides.

kara swisher

Explain what the chief security officer does. Basically, as you once said, they’re all expense and not profit.

alex stamos

That’s right.

kara swisher

But what are you supposed to do?

alex stamos

We’re all L and no P. We’re all loss and no profit. So it’s interesting — the traditional role of a chief information security officer is to be the highest person in the company whose job it is to defend systems from attack. This is a job that exists all over the Fortune 500. So if you’re a big bank, if you’re an oil and gas company, if you’re a defense contractor, you have a CISO. That’s what people call them — CISOs.

kara swisher

[CHUCKLES] Fun party. Go ahead.

alex stamos

Yeah, fun party. And the CISO is running a team to keep bad guys from breaking in.

kara swisher

Right. You’re a cop, essentially.

alex stamos

Yes, exactly. You’re there to detect when people are trying to break in to do things. Now, if you’re an oil and gas company, people are breaking in to steal money or to steal data or to break things. In Silicon Valley, the CISO job has been a little bit different in that when you’re building platforms for other people to use, the job is not just about the company being attacked, but it becomes about detecting and stopping the use of the platform to cause harm. And that’s this whole other area that is often called trust and safety in Silicon Valley. It’s a whole area where you have dedicated people that are working on it. And depending on the structure of the company, the CISO can have a lot to do with that, or it could just be a small part of their job.

kara swisher

You also are working for tech people who think they know everything about technology, too. But talk about the idea of the difficulty of operating in a technological environment.

alex stamos

Well, yeah. Running a security team in a tech company does have different challenges because like you said, what you have is often executives who are themselves technologists. And the other thing about Silicon Valley that I think is both a great power and a great weakness is people really do believe in the mission of the companies.

There’s a lot of people who work on Wall Street who are just working on Wall Street. They don’t actually — when Lloyd Blankfein said we’re doing God’s work, he was joking. There’s people who work for Goldman Sachs who are just doing it to make money. And so people just show up, and it’s your job. And then you go home, and you have a different identity at home. Whereas in tech, we still have this thing where people go to work for these companies because they believe we’re changing the world. And it’s like this port —

kara swisher

Oh, come on, Alex, they like the money. I know you guys think that, but —

alex stamos

Oh, it’s absolutely about the money, too, but people really believe that they’re doing something right. And as a result, that actually makes them more resistant to change in some ways. That if you’re just doing it for the money, and IT is just a tool you use to make money, and you’re, like, oh, we’ve got to change things because we have a security issue, then it’s just about risk management and what the cost is of the downsides and such. But in tech, when you’re talking about oh, we have to change something and it goes to the core of our mission, then you’re really trying to sell people on the idea that, oh, the things you’ve been doing so far were actually not that great. Your core belief in your goodness and the things you’ve been doing is not that correct. There’s something wrong. And that sometimes makes it a lot harder than it would be to do security in a company where IT is just a tool.

kara swisher

So at Facebook, when you were working, the idea is that we’re here to connect people. You were the person in the room that says, you’re connecting people; you’re also putting them in danger.

alex stamos

That’s right. And to be clear, I was not the only person — I don’t want to take credit. There’s a lot of people who work on various safety and trust aspects of Facebook. But I was the most visible from the outside, and that was part of my role. And then I guess, in some ways, I was the highest up inside the company, although never really that high. Nobody in the true inner circle at Facebook is a Cassandra. There’s a lot of Pollyannas. There’s a lot of people who really believe in the mission. And it’s actually tough to break into the inner circle if you’re a negative Nelly all the time.

kara swisher

Mm-hmm. So explain that. You’re a Cassandra; you tell them the bad things. What is the reaction? And I don’t think this is particular to Facebook, but when you say Mark surrounds himself with people who are positive or wants positivity, how does that manifest when you show up and say, that thing has to stop?

alex stamos

Right. And so yes, you’re right, it’s not just about Facebook. I think the specific issue for Facebook is, again, one of Mark’s real strengths has turned into a weakness, which is, when he started the company, he was just a kid. And a difference between him and a bunch of other college founders is he pretty quickly learned that there’s a lot of stuff he didn’t know. And so he surrounded himself with adults, like people in their 30s and 40s, people who had been at Google and Yahoo and a bunch of other Silicon Valley companies. And he built this group around him where he was still in charge. He’s still making the decisions, but he recognized, well, I actually don’t know a lot of this stuff. He didn’t know about advertising. He didn’t know about monetization. He didn’t know about production infrastructure. Anyway, this was a positive thing. This is one of the cores of his success. But it also created this kind of very stable group of people around him that had been part of making all the fundamental decisions. And we’re talking about that 2010 era, the IPO era, when they’re trying to figure out how Facebook is going to make money. And so those people were part of all the core decisions that then, in kind of the 2015, 2016 time frame, started to really backfire. And so it gets really difficult to make the changes necessary when the core decisions were made by the people who are still the loudest voices in the room, because it’s just kind of a natural human thing — it’s very hard to go back and say, oh, wow, I have to change my mind. But the situation changes. There’s a big difference between a startup that is trying to compete and make money versus the dominant media and speech platform on the planet. There’s a lot of difference.

kara swisher

Right, right. So did they understand the power they had? Because one of the things that’s always struck me was this is a group of people who had enormous power who denied that they had it, using words like we and community, when in fact they controlled all the levers of power and the decision making in their company.

alex stamos

Yeah. I mean, I think certainly people on the Sheryl Sandberg side of the house had that understanding.

kara swisher

This is Sheryl Sandberg, who’s the C.O.O.

alex stamos

Right. But those were the people who were really making the decisions. That’s another kind of core issue inside of Facebook. It’s really two companies that are glued together. There’s the engineering and product side, which reports to people who report directly to Mark. And then there’s everything else. And everything else reports to Sheryl, effectively. And to put it a little simply — and this is oversimplified, but I think it is still relevant — the stuff that Mark really cares about, he has reported to him. The stuff that he just wants taken care of, he has reporting to Sheryl. And so that includes all the making of money. Like, this is one of the funny things as I see people talk about decisions at Facebook being made because of money. And there certainly are such decisions. But Mark himself is, in some ways, incredibly resistant to using actual financial numbers in his decision-making process. And so there’s this whole ecosystem of people whose entire job it is to take the decisions he makes and figure out how to make them profitable. Now, they turn out to be really good at that. And the decisions he makes around, OK, our product’s going to be popular; we’re going to have lots of users; people are going to enjoy using our product — those tend to be the kinds of things that are easy to monetize. But you have this weird situation where the kind of shareholder responsibility is sitting over in the corner, and Mark is really focused on these completely different things.

kara swisher

That’s interesting, because that’s why he always seems so high-minded when he talks about what he’s making, without understanding the implications. And so one of my criticisms is the idea of understanding the consequences of your decisions and understanding how they iterate through the system. Let’s fast-forward to one that’s become really important this week. Facebook reversed its stance on Holocaust denial, now labeling it as hate speech. Before, they were just focused on things that more clearly led to violence, and now this is much different. And I had a very big interview with him in which he said, essentially, Holocaust deniers do not intentionally mean to lie, and they should be allowed a huge berth to say these things. Talk a little bit about what’s happening now from your perspective.

alex stamos

Yeah. So I’ve always found the fact that Holocaust denial wasn’t banned to be a little surprising, because it’s a very specific decision that actually seemed somewhat inconsistent with all the other hate speech policies the company has had. Part of it is, I think — for people who argue about free expression, Holocaust denial is the example everybody uses of the United States versus Europe, of speech that is legally protected in the United States. Holocaust denial is not illegal in the United States. It is illegal in Germany and France and a bunch of other European countries. And so I think part of it might be the huge cadre of Harvard JDs, who are the people who have all of the arguments here and then take the arguments to Mark. If you’re taking a free speech class, this is one of the things you learn about. And so it somewhat makes me wonder whether, because it’s such an argued-over area, the company got kind of stuck — because it actually is one of those areas that free expression advocates have been stuck on for a very long period of time.

kara swisher

Mm-hmm. So how do you think that decision was made to change it? He shifted his stance, and he said his stance has evolved. How does that happen within the company? Because these are very big issues for the users and for the world at large.

alex stamos

So there’s this big team called content policy. They report up through the policy team. So one of the issues you and I have talked about is — I believe we talked about this onstage in Toronto. Remember when people flew places?

kara swisher

Yes, yeah.

alex stamos

That was like a lifetime ago.

kara swisher

[LAUGHS]

alex stamos

One of the core issues is, I think, that function needs to be divorced from the rest of the policy team, because you have to be really careful about having your government relations people be too close to your policy decision-making people. But there is this big team whose entire job is to try to figure out what’s allowed on the platform. And then there are all these specialists who specialize in different areas. And there’s a team that works on hate speech and incitement to violence. And I expect what happened was they found that, as they’re chasing the people who are trying to incite violence — when they’re finding people who are straight up saying, I want you to do something bad, and they’re trying to either organize conspiracies or put ideas in people’s heads — when they looked at the progression of those people that they would see in their timeline, Holocaust denial is a common factor. Which I think makes a lot of sense, right? If you’re talking about white supremacists and really bad anti-Semites, they’re going to touch upon Holocaust denial at some point. And the standing Facebook policy was a problem in trying to fight the people who were really bad. So this is effectively what they’ve hinted at — they’re talking about having more data. And so I expect this kind of is a decision that Mark made. And I expect that your interview and the pushback that followed made it less likely that he was going to change his mind.

kara swisher

Really? Why?

alex stamos

I think he’s too stubborn on some of this stuff. I think he gets a lot of public criticism, and it will freeze him in place. And so I expect that the fact that he’s been pushed on this specific issue so much perhaps made him wary of touching it. I think the other thing he’s afraid of is — I remember that interview with him. I was on vacation. And just like any normal person, I’m catching up on my podcasts on vacation. And I was listening to that interview of you and him. And I think part of it, too, is he doesn’t want his personal background to feed into the decisions they make, and that makes him more resistant on this specific issue, and all of the outside criticism kind of froze him in place. Which I don’t think is a good thing, but I expect that’s what happened.

kara swisher

So it took data for him to understand the links between anti-Semitic remarks like this, which were the Holocaust deniers, and the white supremacists who are already violating rules on the platform.

alex stamos

Right. That’s just my expectation. I don’t have any specific insight here. But I expect that they brought him data showing that among the people we have taken down for saying much more violent things and direct incitement — for which we know there’s a link to human harm — there’s a bunch of Holocaust denial, and that we need to take care of these root issues. Because this is the other thing that I think has changed at Facebook: since 2016 there’s been much more focus on kind of systematically trying to figure out how you nip these things in the bud. And that was one of the real failures going into 2016 from a policy perspective — everything was about individual pieces of content. So it’s like, you just got this one post. Is this one post bad enough hate speech? And not thinking, OK, well, what has this person done so far? Or is this post part of a group where everybody in the group is saying things, so that in the context of the group, it’s clear that something that is borderline is actually really hateful? Same with the other areas, like the Russian ads and stuff, right — is this specific piece of content violating, or is it part of a government’s attempt to manipulate the platform?

kara swisher

So it’s confusing to people when they go after individual posts, because it doesn’t create a systemic way to deal with these issues. Why was it chosen to pick individual things — which would be like playing whack-a-troll, essentially — versus a systemic kind of solution for these kinds of things?

alex stamos

Yeah. And I think that’s just kind of the Silicon Valley way overall: almost all content moderation comes out of responding to specific events. It is extraordinarily rare to have a tech company start up and, from the beginning, think, oh, well, these are the negative potential impacts of our product; let’s think it through. This is just a blind spot we have in tech overall — it always takes that first scandal, the first bad thing to happen, before these kinds of components are built into the company and these policies are created. And I think that structure is kind of deeply embedded in how Facebook does this work: something bad happens; now let’s have a team of people, a couple of lawyers, a bunch of data scientists, dive into it and put together a 24-slide deck with all of this data about the thing. And it’s all responsive and reactive. It’s very rarely proactive.

kara swisher

OK. Why is it done that way?

alex stamos

I think the companies are always kind of reactive because it’s both legitimately hard to predict what the emergent properties are of millions or billions of people interacting with one another, but also because some of these are really long-held issues where nobody has come up with a good model of how you want to regulate the speech. And I think part of it is the unique position these companies are in, in that all of our historical arguments about speech are usually about the government controlling it, because traditionally the only people who have had this power have been governments. And I think that’s actually one of the fundamental issues here: these companies are acting in a quasi-governmental manner for which we have no good precedent. And so they’re not like The New York Times op-ed page, where there are a lot of op-ed pages. So if The Times decides not to publish your op-ed, you can go down the street to The Post. If that doesn’t work, you can do pamphlets on the street. Whereas some of these companies are so big — and Facebook and YouTube are probably the really dominant ones, the ones that own their spaces — they’re so big that there’s not a ton of options if you want to reach a large audience in some of these ways. And so when they make a decision, it has the power that a government would have had decades ago.

kara swisher

So you’re making an argument that they’re too big to fail us, or that they fail us because they’re too big.

alex stamos

No. I’m making the argument that the policy decisions that we want them to make are somewhere between the private decisions of what goes in a newspaper and the power of a government to censor. And this is a new thing for which we don’t have a good framework. And I think, when we go back to Holocaust denial, the arguments about Holocaust denial have never been, is The New York Times required to run a Holocaust denial op-ed? Nobody’s ever made that argument. The arguments have all been, should the government have the power to silence these people? And now you have these companies that are somewhere in between. And they’re vacillating back and forth between, we’re acting as a government, and therefore we need to be kind of default open, and we should only stop speech that we know is specifically harmful; and more of the newspaper model of, we need to shape this community. And so if you’re shaping the community — if your rules are about making a community something — then you have a responsibility. If you’re a platform for speech, you’re much more like a government.

kara swisher

Well, except you’re not a government. You’re a private company that makes billions of dollars, whose founders have become some of the wealthiest people on the planet. I think part of your argument is sort of like, guns don’t kill people; people kill people. But actually, the tools do kill people — more people — depending on how you make guns and how you distribute them. How do you push back on the idea that these are corporations that can make these decisions? They are private platforms, and they are not the government.

alex stamos

They’re not the government. And under our laws, they have a pretty much total First Amendment right to decide what’s on their platforms.

kara swisher

They’re making products that might help amplify this problem, and then they won’t make the decisions to fix them.

alex stamos

It’s on the things that are amplifiers that I think they have the most responsibility. This is where I think we have to be — they are both. They are incredibly powerful. And you’re right, one option here is to break them up. But breaking up the companies does not get you what you want. It gets Marco Rubio or Ted Cruz what they want. This is one of the things I find kind of hilarious about the current situation: you’ve got both Democrats and Republicans saying, I want to break these companies up because I’m angry about their speech policies. But they both think that breakups will benefit them. And it’s probably the Republicans who are right. If you had 10 Facebooks instead of one, you would probably have much less control of speech.

kara swisher

All right. But the idea that they don’t have responsibility, and that they are like a government — you really believe that’s the case?

alex stamos

No, I’m not. You’re putting words in my mouth.

kara swisher

OK.

alex stamos

I think there’s something new. And we don’t have — kind of in Western democracy — we don’t have a good model of this. And so what I’m saying is, the arguments that we have — again, I think too many of the people making decisions are lawyers, and the problem is that the lawyers have been trained on all of the history of speech regulation, and all that history is about government control of speech. It’s very rarely about private actors controlling speech. So there’s something in the middle. But if we look back, if we try to come up with historical parallels, it’s hard to find one where we’d want those parallel actors to be as aggressive about controlling speech. So I don’t think anybody was arguing that AT&T should listen to every phone conversation, report certain phone conversations to the government, and then cut off ones where people are pro-communism.

kara swisher

Fair point, but it’s not the same thing, is it?

alex stamos

Well, it kind of is. I mean there are parts of Facebook that are like the phone company. There are parts that are like a newspaper. And this is the other problem is that when you say Facebook, or you say YouTube, you say Google, you’re talking about a bunch of different products that have different kinds of amplification and, I think, different levels of responsibility. So the parts that amplify, the parts that connect people together who didn’t know each other, and the parts that allow you to take your speech and to send it out to lots and lots of people, I think the companies have a lot more responsibility than the components where you’re talking in very small groups. That’s much more like the phone company. And I think that the privacy and the free association issues are much more important down at kind of the small end. And so that’s why I think we have to be careful with talking about very specific parts of the product when we talk about these controls.

kara swisher

OK so let’s talk about QAnon. There’s a new Facebook policy where they have banned QAnon. Explain what QAnon is. Yeah — explain that.

alex stamos

OK, yeah. So, well, this is the hard part. What is QAnon? It’s something closer to a religion or a cult. When I think of QAnon, I think it’s kind of like online Scientology, which is, it’s this set of beliefs that start with the idea that there’s a massive problem of child sex rings around the world, and kind of work your way up to: it’s not just that there are horrible things happening to children, but this is part of a worldwide conspiracy that includes most of the prominent members of the Democratic Party, and they’re doing things like killing babies and taking certain hormones from their brains; there are child sex dungeons inside of buildings. There’s all this really crazy stuff. So I think it’s like Scientology, which starts with, ooh, Tom Cruise looks great for his age, and then five years later, we’re all reincarnated aliens. It’s this set of beliefs that start kind of simple and then work people up to really crazy conspiracies.

kara swisher

Right. And so there’s this whole crazy subculture online, but it’s jumped to Facebook because these things do jump onto the bigger platform.

alex stamos

Right. And this is where QAnon is like ISIS, in that the core of ISIS was not on Facebook or Twitter. It was always on Telegram and then some other kind of alternative channels that were big in the Middle East. But what would happen is they would decide on Telegram, we want to do recruiting. And so they’d create kind of a watered-down version of their stuff, like really highly produced videos and such. And they would send people to YouTube and Twitter and Facebook to go spread the word with the watered-down version. And that’s what it’s like with QAnon: you have kind of the core believers talking about the really, really crazy stuff on these alternative sites. And then they go out, and they try to entice people into their world by finding them on the big platforms. So one of the other decisions that Facebook made that ended up being poorly timed with QAnon and some other stuff was, after the kind of fake news scandals of 2016, Facebook moved more and more to pushing content that was created by people on the platform itself. So pushing people less to links, which was kind of the core of the fake news crisis — people leaving Facebook to go to websites that had crazy, crazy stuff. And that really pushed content that was coming from groups. And so lots of people moved into private groups on Facebook. And in those private groups, QAnon got big.

kara swisher

Facebook is calling QAnon an identified militarized social movement.

alex stamos

Right.

kara swisher

So explain to people what they’ve done and what you think it means.

alex stamos

So Facebook has effectively created a new level within this category of organizations they call dangerous organizations, right? So the dangerous organization category comes out of a time when I was at Facebook, and our biggest issue was ISIS. The biggest content policy issue was that the Islamic State had professional propagandists who created all this propaganda. And then they had thousands of supporters around the world who would amplify that propaganda all day. And that was both to celebrate the Islamic State and to recruit people to go and fight for them in Iraq and Syria. And so this organization was created inside of Facebook called dangerous organizations, which was specifically about understanding terrorist groups and other kinds of organized violent groups and then fighting them. What’s happened here is they’ve now created three levels of dangerous organization. So at the top is still the dangerous organization, like ISIS, organized white supremacists, the Ku Klux Klan, stuff like that. Then there’s the dangerous individual. Those are violent individuals, and then they have supporters. So this would be like the Christchurch shooter — other people whose names I’m not going to use because I’m not going to give them the airtime, but horrible people who then have folks who support them online. And now they have this third level called militarized social movements. And this is what they’re calling QAnon. And I think the fact that they have to do all these levels demonstrates the challenge that they’re going to have, which is: the thing about an ISIS or a white supremacist group is that they actually have both a membership and an ideology that’s definite. So ISIS had a real command and control structure. They also had a very specific ideology that you could hold on to. White supremacists generally are less cohesive. So it’s much more rare for them to say, I’m part of a group, but they have a really coherent, horrible ideology. And the white supremacists can’t go 15 minutes without using a racial slur or some anti-Semitic term or something.

kara swisher

So they’re easy to track — you’re saying these two groups are easy to track and easy to eliminate.

alex stamos

Yeah. Because there’s something there. You have to have — if you’re saying, I want to get rid of blank, you have to have something to hold on to. And so in both cases, you can say, I’m going to hold onto, you say you’re part of ISIS. I’m going to hold on to, you say people should travel to Syria to fight for the caliphate. I’m going to hold on to you having horrible white supremacist phrases.

kara swisher

So when you compare a group like QAnon with ISIS — and there are these differences that you’re talking about — should Facebook be treating them like ISIS?

alex stamos

I don’t think this was the appropriate way to take care of it. I think, first off, the core product problem here was group recommendations. And one of the things — it’s hard to make the call from the outside, because now that I’m on the outside, I don’t have access to that data anymore. And I think one of the things that isn’t well understood on the outside is how much these things are dug into and how much data is created on each of these issues before a decision is made. And one of the little things that leaked out was, there was apparently a report that said that of the people who joined extremist groups, something like 60% joined because of the recommendation algorithm.

kara swisher

Explain what that is.

alex stamos

Yeah. So if you join groups on Facebook, Facebook will recommend other groups to you. So if you’re, again, like in the —

kara swisher

Same thing on YouTube with videos, et cetera.

alex stamos

Yeah. And this is where these companies — we’re talking about where they are between a newspaper and a government. And this is where they have much more responsibility, and they’re much more like a newspaper or television station, and they are choosing, I’m going to highlight this content and put it in front of people who weren’t specifically looking for it.

kara swisher

And then what to do? Treat them like ISIS and pull them down, which they’ve done, or what?

alex stamos

So I think that this was a mistake because, one, it’s going to be effectively impossible to wipe out QAnon. So now that Facebook has said we’re wiping out QAnon, they have to do it. And I think that’s going to be almost impossible because, again, there’s no real consistent ideology. And so what’s going to happen is the people behind the drops and stuff are going to manipulate, are going to change the ideology in a way where it becomes more and more milquetoast and very difficult to decide, I’m going to kick off some legitimate user under their real account that they’ve had for 10 years. I’m going to kick them off because they’re talking about this theory that also is held by Q that isn’t really harmful by itself. I think the real focus should have been on the product stuff. The problem with product changes is, if they completely changed how the recommendation algorithm worked so that this wasn’t a problem anymore, that takes a long period of time. And there’s also nothing that they can announce. And I think they’re really going for the PR move here of, we’re banning QAnon. We’re making that announcement, and they can get lots of positive press feedback, and I understand why they’re doing that. But I think it’s got a short-term benefit. And in the long term, it’s going to be really difficult. Because this is the other thing about ISIS: we were really effective about kicking ISIS off, but part of that was that being a part of ISIS is illegal. Being an Islamic terrorist is illegal. And if you were part of those secret Telegram channels, and you’re in the West, you very well could get arrested. If you were in Syria, you could get hit by a drone strike. There are people who were killed by the U.S. government whose only role, effectively, was IT support for ISIS. None of that’s true for QAnon. Being a part of QAnon is not illegal. And so there’s no kind of pincer move that you can have between the government and the platforms here. So I think they will end up existing forever.

kara swisher

So how do we govern these companies, then? They have to deal with all kinds of issues. What would be a solution — either holding back this power or dealing with this power?

alex stamos

Yeah. I mean, this is one of the key policy questions of the next five years: who makes the rules for massive international platforms, and who has a voice in that? So the first thing to remember is something like 95% of Facebook users are not Americans. So we can’t just try to come up with a solution for Americans by Americans. Whatever we do, we have to think internationally. The second is, one of the traditional things people have said is just regulate them, regulate them, regulate them. But the United States doesn’t regulate speech like this. In the U.S., we won’t pass these laws. And we have to be really careful about just supporting government regulation, because a lot of the horrible things that happen around the world are because of governments. And so we have to have a strategy that does not say local law controls speech all over the world, with Silicon Valley becoming the handmaiden of oppressors all over the world. I think the Facebook oversight board is a reasonable first step. The crazy thing —

kara swisher

They haven’t met yet.

alex stamos

Forever. If you’re going to — the fact that they have not had a big meeting before the U.S. election is insane to me. If you’re going to do this —

kara swisher

This is a board that’s supposed to determine case by case issues. Again, a slow roll as far as I’m concerned.

alex stamos

And I don’t know if that’s just the operational speed of Facebook, or if there’s a fundamental argument between teams inside Facebook, and one of them is successfully slowing this down because they don’t want their power taken away. But that’s the first example I can think of one of the companies saying, we’re going to, one, publicly argue these things, but also give up some power over our own content policies. So I think that’s a reasonable start, in that at least we’ll start to get transparency. Like this QAnon decision — instead of it being made in Mark Zuckerberg’s conference room and then turned into a press release with no data, what I would have rather seen is three weeks of public Zoom calls where we’re watching an argument over this, and people from the dangerous organizations team get up there and show us the data. Here are the 24 slides, and here’s the data of how QAnon is using Facebook, how the algorithm is amplifying it. Here are your different options. And then have the review board argue over the different options. I think even if you disagreed with the output, at least you would see how the sausage is being made, and people would have some feeling that there’s a little bit of process here instead of these arbitrary decisions being made. The company is going to have to move to something like that, because all they’re doing right now is demonstrating that if you yell at us enough, we will change our minds, and we’ll do so in a way that is not based upon any kind of logical theory.

[music]

- (SINGING) Sway.

kara swisher

We’ll be right back.

[music]

- (SINGING) Sway.

kara swisher

One of the things that you did late at Facebook was build the counterterrorism team, dealing with all these issues, working with the F.B.I. and law enforcement agencies and things like that. And one of the things that you dealt with was election integrity. And I don’t want to use the word hacking. What would you have called it?

alex stamos

Yeah. So the term we use now is influence operations. And that’s the term for governments that run teams that are trying to manipulate the political environment in other countries. So one of the things that I was responsible for is, after the U.S. election, I built a team that looked into what happened and what different activity was on the platform. And we found a couple of different Russian operations that we announced, first in the spring and then in the summer of 2017. Since then, that team that we first built in 2015 has become really big at Facebook, and a bunch of other companies have built the same thing. So inside of Google and Twitter, there are now teams that think about influence operations, and whose job it is to find groups who are trying to manipulate the platforms in an organized manner. Most of them are state-related, although there are now some private actors. And I think that’s actually one of the interesting things that’s been happening: now you have private groups that are being hired to do this kind of work.

kara swisher

Yeah. So that’s influence operations. You’re talking about propaganda.

alex stamos

Yes, propaganda. Yeah. And there’s a long history of this. There’s a great book called “Active Measures” by Thomas Rid, which has the whole history of kind of the KGB doing this with flyers and student Communist Party groups and stuff like that. And effectively, that’s all gone online.

kara swisher

And now on Facebook, specifically — because it’s the biggest, and it’s also the most porous, it would seem — you have a perfect place to do this, to do these influence operations.

alex stamos

Yes and no. So on the foreign influence operations, I think the companies are doing a pretty good job, and Facebook is leading this. Like, on the foreign groups, that has been a huge focus. And the, quote unquote, “nice thing” about a foreign operation is it’s really easy to come up with a rule that you just say, look, you can’t have people in St. Petersburg messing with an election in Germany or the United States. That’s an easy content policy to have.

kara swisher

And this was the famous Internet Research Agency? Is that correct, the name of it?

alex stamos

That was the big player in 2016 — the Russian Internet Research Agency, which is a private organization owned by an oligarch close to Putin. What’s happened since then is there’s been a massive explosion of this kind of work. And so this is what I’m doing now: I run a team called the Stanford Internet Observatory at Stanford University. And we have a bunch of researchers looking into this kind of activity around the world. And what we found is that Russian-style influence operations happen all over the place. It happens in Africa. It happens in Asia. Most of it is domestic. So most of the time these operations are not between countries but inside of countries, by the ruling party or powerful political interests, although there still is obviously some foreign stuff going on. And this brings us to the U.S. election. The companies have responded very forcefully to the foreign influence. They’ve had the backing of the U.S. government on this. So one of the big differences is now there’s this massive task force between the tech companies and the U.S. government to do this kind of work. But what has happened is most of these issues have moved into the domestic space. And the vast majority of the trolling that we saw from Russians is now being done by American groups. And that makes it much more complicated.

kara swisher

On lots of issues like misinformation on Covid or the election or whatever.

alex stamos

Yeah, on lots of issues. But specifically, what I’m concerned about is the election. And what we’re working on right now is the election. And the vast majority of disinformation about the election is definitely from domestic groups.

kara swisher

All right. So talk to me about what you are doing. And what’s going on inside of Facebook? Because each tech company has had a different response. Twitter has stopped political advertising. Facebook is allowing it. There is a big controversy over allowing politicians to lie. And still others think that most of the action is about political content, not the ads themselves.

alex stamos

So, yes, there are different things happening here. So for our part, we are part of this coalition that we put together called the Election Integrity Partnership. That’s Eipartnership.net for your listeners who want to see our stuff. We’re with the University of Washington’s Center for an Informed Public; the DFRLab, which is part of the Atlantic Council; and a private company called Graphika. And these four institutions, we are working together to spot election disinformation, to take reports from trusted reporters on the outside, organizations like Common Cause and the like. We are also taking reports from local election officials. So if you’re a county election official, and you see online disinformation, you report it to this thing called the Election Infrastructure ISAC, which is a semi-government body that’s funded by the Department of Homeland Security, and then they send it to us. And we go and investigate election disinformation. We find what platforms it’s on. We match it up with the platform policies, and then we report it to the platforms for them to action it. And then we write up our results if it’s a big enough issue. And so we have a bunch of these write-ups on our website.

kara swisher

So aren’t you sort of doing their job for them? Isn’t this the job of these platforms to be doing this kind of stuff?

alex stamos

Right. And so a bunch of them are doing this. Facebook has people working in this area. I would guess Twitter and Google have people working on these things. There are a couple of things going on. One, the companies care about their own platform. So if you report something to Twitter, Twitter will take care of it on their own platform. The problem is that, just like with QAnon, no disinformation stays on one platform anymore. So if you see it on Twitter, it’s going to be on Facebook. If you see it on Facebook, it might be on TikTok. It might be on Reddit. It might be on Pinterest. And one of the interesting things that has happened is you have three companies — Google, Twitter, Facebook — that have really invested in this area. And then you have hundreds of other companies that either can’t or won’t. And so one of the things that we’re doing is, if we find something on a big platform, we try to find it everywhere else and report it to the companies that don’t have the capabilities to build big teams. The second thing we’re trying to do is bring transparency to this. So if you report it to Twitter, they’re just going to take care of it, and then you’ll never hear about it, right? If you report it to Facebook, they’ll quietly take care of it. They’re not going to announce it. And after this is all done, we’d like to have these four institutions write a report: this is the kind of disinformation that happened during the election, and these are the policy responses.

kara swisher

Right. And so what are their respective policies going into the election? Can you break it down very quickly for people who aren’t following it? Because it’s super confusing.

alex stamos

It’s super confusing. We actually have a huge blog post on this that we keep on updating because they keep on changing their policies as we get closer and closer to the election.

kara swisher

Great. Sounds organized. [LAUGHS]

alex stamos

Well, in some cases, I think that’s not appropriate, because they should have figured this stuff out six months ago. In some cases, it’s OK, because they’re reacting to current events. So the way we think about it, there are a bunch of different categories of election interference. There are situations we call procedural interference, where you’re trying to confuse people. You’re telling people, if you sign your envelope, it’s illegal. Or, they’ve changed all the ballots, so throw away your absentee ballot. There’s participation interference, where you’re trying to tell people not to vote. So an example of that would be, on Election Day you say, Antifa started a riot outside of the polling places in Miami. Don’t go vote because Antifa’s there. There’s fraud, where you’re actually asking people to commit fraud. And then especially important this year are two new ones that we’ve been tracking, and that the companies have had to create new policies for. And one —

kara swisher

Ah, what fresh hell is happening to us.

alex stamos

Yeah, fresh hell. And so the two new ones are, one, trying to undermine people’s confidence in the election — seeding the belief that it was stolen. Now, this is a really hard one to come up with policies for because —

kara swisher

That’s Trump’s Twitter account doing that.

alex stamos

Effectively Trump’s Twitter account.

kara swisher

[LAUGHS]

alex stamos

And this is a top-down problem, in that you’ll have Trump and Fox mostly set the agenda for this. And then that will trickle down to all these blue-check conservative influencers, who then get millions and millions of views for their content. And then you have lots and lots of other people who will amplify it and then add fake evidence. And so you have this whole ecosystem at work. From the top, they are setting the direction we are going with this disinformation. And then that’s implemented by all the soldiers at the bottom level. And then the second category that we’ve just added to our comparison, which we think is really critical, is calls for people to physically go to polling places. And this is something that people have gotten really worried about after the Trump-Biden debate, where Trump was talking about how the Proud Boys should stand back and stand by, and other things he has said that are effectively trying to pull people to do what he calls poll watching. But standing with a gun outside of a polling place is really voter suppression. And those kinds of calls to go to the polls are the other thing that we’re tracking now, and that the companies have just put in some policies around.

kara swisher

So what policies could they put in? Because they are the vehicles for this. They are carrying this information everywhere.

alex stamos

They are. And they’re taking it down in some cases. And in other cases, they’re not. I mean, one way or another, we’re not going to completely eliminate this. So the only thing we can ask for is reasonable policies, and then for them to have a very tight process for finding and taking things down before they go viral. But again, the real problem here is these verified accounts with large followings — not just the president himself, but the next tier down is what’s really dangerous. So you’ve got the Don Juniors, these kind of conservative influencers who have millions of followers and who — they don’t get the coverage the president does. So there’s often, I think, less pushback on them when they say some of these things, because it doesn’t make the jump into the mainstream media. But the people they’re trying to reach absolutely hear their message.

kara swisher

All right. But then here you have the platforms handing them these tools to do so. I mean, I hate to get back to that, but they’re beating back problems that they created themselves through their own tools.

alex stamos

Well, I think in the internet era, you have to separate out what is an internet problem from a platform problem. And I think —

kara swisher

Or a human problem, is what you’re saying. An internet problem versus a platform problem. OK, do that. That’s kind of geeky, but go ahead.

alex stamos

So there are human problems that are amplified by the internet. And there’s specific amplification by platforms. But yeah, I mean — if you’re going to allow this, we have to have rules that are more than just, people we don’t like can’t have speech. And that’s actually, I think, where I differ from kind of what you might call The New York Times’ consensus on the platforms: we can’t just say we’re going to make people better through control of their speech. And if you’re going to enforce a rule, the rule has to be very specific, and you have to be willing for that rule to be enforced on your own side. And I think that’s one of the challenges that we have to deal with now. There are a lot of great things that come out of people having the ability to have online speech that has never existed before. And there’s a lot of great societal change that’s coming out of that. And if we’re going to have rules, we have to have rules that those good things are going to live within as well. And so around the election, I think we can have very specific rules about disinformation, about lying, about things that are going to discourage people from voting, about things that might cause violence. And those rules should be enforced really aggressively. But that’s different than saying the people on the other side should just be deplatformed. And I think that’s a really dangerous direction that a lot of people are going in. And I don’t think that ends well —

kara swisher

Well, what happens though — look what happened with Alex Jones who was doing the Sandy Hook lies. It’s a persistent breaking of rules. When you’re talking about people saying go to the polls with your gun, and then you get rid of that information, it’s not speech I hate. It’s speech that’s something different.

alex stamos

And that’s the kind of speech that we have to enforce against. And I think there are situations where there are kind of blue-checkmark influencers who have very large followings, who are not punished like they should be. If people are saying things that might end in violence, they should be off for 30 days. They should be off for 60 days.

kara swisher

So why don’t the platforms do that?

alex stamos

This is where a structural problem becomes a big deal, which is: structurally, the people who are making content policy are way too close to the government relations people. And in some cases, like at Facebook, you’ll have a VP who owns both of those things. And so having a person whose job it is both to control political speech in the run-up to a democratic election and to keep whoever the winner is happy, I think, is a completely bad model. If you’re going to make these decisions, either you have to do so openly and publicly via some kind of content review model, where people on the outside are making them and we can see the sausage getting made, or you have to wall off the people who are making the content decisions so they are not vulnerable to lobbying by the winners. And that’s the real problem here: I think it’s all about power now. And you’re right, the possibility that you have a Democratic White House and a full Democratic Congress is probably changing the decisions they are making, which is not good. The speech decisions of 300-some million Americans should not be decided by who is leading in the polls. And so whether or not you like the decisions being made, the process here is extremely scary.

kara swisher

Right. It’s all about the power and not about the good decisions.

alex stamos

This is about power. And in cases where power is being used to manipulate these platforms, which are incredibly powerful intermediaries, we have to be very careful to make sure any complaints about them are really about the action being complained about and not a side move to try to get them to change. Kind of the Republican argument is that conservatives are being oppressed. And so they come up with these arguments of, we’re being suppressed, but then have no evidence. Unfortunately, Democrats have fallen into this trap, because Democrats don’t want to be seen as pro-tech. They want to be seen as anti-Big Tech as well. And so they’ve gone along with things like Senate hearings, right? And I think a number of Democrats agreed for there to be a hearing on October 28. They walked right into a trap that Republicans set, which is, right before the election, they’re going to pull the C.E.O.s up and yell at them about interfering on the liberal side. And this is something that I really think is important, because I think the media keeps screwing this up. You can’t just say, we don’t like tech companies, and therefore we’re going to applaud every situation in which somebody says a tech company is bad, because a lot of the people saying that don’t actually believe the same thing you believe. They’re just trying to manipulate the companies into changing the rules so they can say whatever they want. And this is something that’s going to have to change under the next administration: we’re going to have to regulate these companies based upon their actual actions, not based upon completely separate and orthogonal speech issues and trying to get them to be on, quote unquote, “our side.”

kara swisher

Yes. That is a fair point and irritating at the same time.

alex stamos

I’m trying to live up to the — yeah. Yeah.

kara swisher

So platforms have stepped in, Facebook slower than most, as usual, around Donald Trump, covering up things he says or taking them down. Twitter covers up tweets or flags them in some way. Facebook, again, tends to either take them down or do nothing else. There’s no subtlety involved. But they’ve done that.

alex stamos

Yeah. So you’re right. I think there has not been enough experimentation here of different ways to deal with this. Like you said, Twitter will often — like a Trump tweet that violates their policies, they’ll make you click through a box to get to it. And then most importantly, they turn off a bunch of the amplification.

kara swisher

And this is new.

alex stamos

Right.

kara swisher

This is new, by the way.

alex stamos

Yeah. Most of this stuff has only come out in the last couple of months. And effectively, these are things that have been invented just for Trump. Any normal account would be shut off, but these companies do not want to be shutting down the President of the United States, for a variety of reasons both bad and good. I think there are good reasons that we don’t want trillion-dollar companies censoring the elected leaders of democracies. There is a legitimate issue here that has to be argued over. And so they’re trying to find a third way through product design. And I don’t think there’s been enough experimentation here. But I actually like the Twitter model, because it takes away the idea that you’re actually censoring somebody. The content’s there. It’s on the historical record. Historians are going to be able to see this tweet. People can see it to criticize it. But it gets rid of the amplification. And it goes to kind of the core of what I was saying before, which is, I think the companies have most of the responsibility around amplification. And so we have to come up with these subtle interventions where maybe you allow the speech to exist, but you don’t give it the benefit of the platform that allows it to reach millions and millions of people easily. And so you’re not censoring, but you’re also just not giving them kind of the capabilities that you’ve built for other purposes.

kara swisher

So Donald Trump, as you said, they invented new things for him because he keeps violating policies constantly. And he feels a measure of invincibility, which he should, because they don’t do anything about it. On concerns about the transfer of power: what measures, if you had to guess, are Facebook and Twitter and places like that preparing to take? What do you think they’ll do?

alex stamos

Well, I’m glad you asked, because this is a specific thing our team has looked at: we’re trying to measure all the platforms against each other in a situation where a candidate claims victory before it’s been determined by a neutral party. It turns out that, of the companies we looked at, there are platforms that actually have rules around that: Facebook, Twitter, and TikTok, of all companies. Facebook and Twitter have both said that they are going to label or remove such claims. Labeling, again, generally doesn’t allow you to reshare stuff. So the labeling isn’t just a label; it’s a label and kind of a down-ranking if the race hasn’t been called by some kind of authoritative source, and they define which sources are authoritative. TikTok has said that they’re not going to take it down, but they will reduce its visibility. We have not seen any policies from YouTube, Reddit, or any of the other platforms. This is an ecosystem problem. The truth is, if Donald Trump gets on the podium at the White House and says, I won, that is going to be on every single TV station in real time. Now, after he says that, you might have talking heads on CNN and other places say, no, he hasn’t won yet. Fox News, we’ll see. I mean, I think one of the really interesting questions is what Fox News does, because that’s going to be critical to whether or not you have 30 percent of the country thinking the election was stolen if Trump loses. So it is a newsworthy thing. It will be on the front page of The New York Times. Trump’s words will be there. They will not be censored, but they’ll be put in context. And so the question is, what is the equivalent platform solution that allows a newsworthy event to exist on the platform but then adds context? For them to label it, I think, is reasonable. We haven’t seen the label yet, because this is such a sui generis thing; it’s going to happen once, or it’s not going to happen. So they haven’t actually demonstrated it. I’d actually love to see the design for it, so we can criticize it beforehand. But assuming it’s really aggressive, then I think it’s the equivalent of the solution that most people in the media who are handling this responsibly will come to, right? You’re not going to have a media boycott of Trump’s words. You’re going to have coverage of those words within a context that adds the facts.

kara swisher

And what about the people below Trump that you were talking about, the ones who are doing the proliferating? You’re going to have to watch all of those same people, correct?

alex stamos

And that’s an interesting question of what you do. If somebody says, I believe this person won, that’s a tough thing for them to censor. I don’t want them having a rule about that. This is a situation where I think you treat the candidate differently, because we should hold the candidates to a higher standard. But their supporters are different, and then there’s somebody in the middle. I think who you hold at that higher level are the official surrogates. So a Don Junior, or somebody who’s part of the Republican Party, or elected officials: you should probably hold them at that level. But in a democracy, people get to argue, and this is something we just have to live with. The hard part of freedom is other people having it.

kara swisher

[LAUGHS]

alex stamos

Well, that’s the thing. And I think that’s what gets lost in a lot of these discussions: I would like the ability, if I believe an election was stolen, to go out and write a post saying, this is why I think this election was not legitimate. Now, when I do that, it should be based upon fact. It should not use invented data or some kind of fake video or something like that. But I want to have that ability. And for me to have that ability, a bunch of people I disagree with also have to have that ability.

kara swisher

So what you’re saying is kind of interesting, because when I met Mark Zuckerberg, he called Facebook a utility. If it’s a utility, and utilities are heavily regulated by governments, what do you do? What can people do from the outside to influence policy at these tech companies? It can’t just be me yelling that you’re wrong about Holocaust deniers and that perhaps they lead to violence. Is it a lost cause? It feels like one some days. Someone asked what I would do, and I said, I don’t know. It’s so big and so massive and so complex that I’m not sure anything besides shutting them all down will work, like just stopping and starting over. I know that’s not going to happen, but I think about it.

alex stamos

Well, I mean, I think if we start again, you end up with the same problems. Again, most of these are internet problems. If the cost of moving data goes to zero and people are able to communicate with each other across the world, there are some fundamental things that come out of that. And that’s one of those silly things I hear from people: just shut down the platforms. I go, OK, great. And whatever replaces them, you’re going to have the exact same thing. This is what I see over and over again: every startup ends up with the same trust and safety problems. This is something I’m trying to fix. I’m teaching a class at Stanford so that 23-year-olds, mostly guys, don’t graduate from Stanford and start companies where they’re like, oh, I’ll just let people send anonymous photos to each other. What could possibly go wrong?

kara swisher

[LAUGHS]

alex stamos

Well, a gazillion things can go wrong. And so I —

kara swisher

I have been in that room, Alex. I have been in that room.

alex stamos

You’ve been in that room. Yeah.

kara swisher

Yeah.

alex stamos

No. There are good people behind Discord. They want to make a fun chat app where people can play video games with each other and talk to each other, and then they end up with a massive Nazi problem.

kara swisher

And so humans suck is really the takeaway: humans aren’t good when they have free rein. What is the solution?

alex stamos

So this might sound wonky, but I think we have to move some of our criticism from specific policy decisions toward the structure. We need structural reform of how these decisions are made. We want these companies to have external bodies reviewing this stuff. We want transparency. And the problem is that almost all the criticism is, I want this person I disagree with to be silenced. That’s the vast majority of it. After the election, when things calm down a little, we need to look at the structure: who’s making these decisions? What power do they have? What transparency do we have on the outside? There are laws that need to be passed here around transparency. I don’t think the U.S. government is going to pass laws saying, these are the rules on Facebook. But what it can do is say that Facebook needs to publish information about what the rules are. Facebook needs to publish the decisions that are made. Facebook has to have a big data archive of all the stuff that has been content-moderated that is accessible to academic researchers under reasonable privacy protections. The interaction here with privacy law is actually pretty complicated, and that’s one of the reasons we’ll need legislation. But those are things that can be legislated. First, we need these decisions to be made in public and the data to be public, because then at least the criticism can be based upon evidence. But in the long run, yeah, it’s going to be tough, because these are international companies, and it’s very difficult for us in America alone to control the speech of billions of people around the world.

kara swisher

All right, Alex, I really appreciate it. You have terrified me.

alex stamos

Thanks, Kara. Have a good one.

kara swisher

Thank you so much.

alex stamos

Bye-bye.

[music]

kara swisher

“Sway” is a production of New York Times Opinion. It’s produced by Nayeema Raza, Heba Elorbany, Matt Kwong, and Vishakha Darbha, edited by Adam Teicholz and Paula Szuchman, with music and sound design by Isaac Jones; fact-checking by Kate Sinclair. Special thanks to Liriel Higa and Kathy Tu. If you’re in a podcast app already, you know how to subscribe to a podcast, so subscribe to this one. If you’re listening on The Times website, and you’re not QAnon, and you want to get new episodes of “Sway” delivered to you, download a podcast app like Stitcher or Google Podcasts, then search for “Sway” and hit subscribe. We release every Monday and Thursday. Thanks for listening.

[music]

In part one of Sway’s two-part election integrity series, Kara Swisher speaks to Alex Stamos, former Facebook chief information security officer and current director of the Stanford Internet Observatory, about what went wrong in 2016 and what Big Tech can do better in 2020.

[You can listen to this episode of “Sway” on Apple, Spotify, Stitcher, Amazon Music, Google or wherever you get your podcasts.]

Mr. Stamos — known in Silicon Valley for his willingness to speak truth to power — rose to national prominence when he departed Facebook amid disagreement about the tech giant’s handling of Russian interference in the last presidential election.

As Election Day draws nearer, social media platforms are amending their policies around political advertising, disinformation warnings and moderation of online groups like QAnon. But how do these decisions get made? What do these platforms plan to do if there is a contested presidential election? And whom can we really trust?

(A full transcript of the episode will be available midday on the Times website.)

[Image: Illustration by The New York Times; photograph by Steve Marcus/Reuters.]

Thoughts? Email us at sway@nytimes.com.

“Sway” is produced by Nayeema Raza, Heba Elorbany, Matt Kwong and Vishakha Darbha and edited by Adam Teicholz and Paula Szuchman; fact-checking by Kate Sinclair; music and sound design by Isaac Jones. Special thanks to Liriel Higa and Kathy Tu.
