It is an accident of history that so many tech and social media companies are based in the United States. Facebook, for example, has more users than any nation in the world has citizens. Thus, when a company like Facebook sets rules for content moderation of things like hate speech and pornography, it has truly global implications. David Kaye, the United Nations Special Rapporteur on freedom of expression, joins the show to argue that supranational tech companies should adopt supranational standards for content moderation, namely the UN’s Universal Declaration of Human Rights, with hopes that doing so will constrain governments from limiting basic speech rights.
What is the interaction between surveillance and free speech? What is the digital access industry, and what role does it play? Is there a tolerable form of censorship? What is the Google Spain, “right to be forgotten” case? How should we think about democratization of social media platforms? What is the Universal Declaration of Human Rights?
00:06 Paul Matzko: Welcome to Building Tomorrow, a show about the ways tech and innovation are making the world safer and more prosperous. We’re talking today about content moderation, and not just in the United States but across the globe. It’s something that I’m sure you see in your newsfeed on a seemingly weekly or monthly basis: YouTube or Twitter or the other major social media platforms make the decision to either pull someone from the platform or not, and there’s a firestorm of controversy about that content moderation decision. This is not a problem that’s going away; this is something that people are going to be angry and concerned about for the near future. So, we decided to bring on someone who’s not just an American expert in content moderation policy but a global expert. Will Duffield and I are joined by David Kaye, the United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, and the author of Speech Police: The Global Struggle to Govern the Internet. Welcome to the show, David.
01:07 David Kaye: Thanks for having me.
01:08 Paul Matzko: And what is a Rapporteur?
01:09 David Kaye: That’s a good question. It’s a Special Rapporteur, the “Special” is important there. Basically, the UN’s Human Rights Council appointed me as its monitor for freedom of expression around the world, and it’s a position that has existed since about 1993 or 1994. I’m the fourth one for the Human Rights Council, and there are about 50 other Special Rapporteurs handling different areas of human rights around the world. It’s been called the “crown jewel” of the human rights system of the UN, and it is. If you look at the kind of work that we do, we’re really untethered, in many ways, from the restrictions that others have in the Human Rights Council or other human rights mechanisms around the world.
01:57 Paul Matzko: And so when you say you are… There’s 50 of you, you’re around the world looking at different areas. For you in particular, it’s freedom of expression. What do you actually do on a day‐to‐day basis?
02:08 David Kaye: Yeah, so on a day‐to‐day basis we communicate directly with governments and, increasingly, and it goes to the topic of the book, with companies in terms of their restrictions on freedom of expression. Quite a lot of that, in the day‐to‐day sense, deals with attacks on journalists and journalism, which have increasingly been online attacks, but it’s also physical attacks, it’s Jamal Khashoggi. So it’s killings; that kind of thing is our day‐to‐day. That’s the bread‐and‐butter of the work. But I also do thematic reporting to the Human Rights Council once a year, and to the General Assembly once a year. So, I’ve done reports on encryption and anonymity, sort of the first human rights overview of the importance of digital security, protection of sources and whistle‐blowers, all sorts of thematic issues. I’ve done reports on those, and my predecessors have also, and then the last thing is I do country visits.
03:09 David Kaye: It’s a funny system, I have to ask a country for an invitation, which is weird. It’s like asking for an invitation to a party. But in order for me to officially report to the Human Rights Council on a particular situation in a state, I need to get an invitation. So, I’ve visited places like Turkey, not long after the attempted coup and the strict repression of journalism and academics and many others.
03:36 Will Duffield: Just gutting the teaching profession there.
03:38 David Kaye: The teaching profession, civil service, civil society. It’s really… It’s just gotten worse. And that was a few years ago that I did that visit. But also Mexico, Japan, which is a free society but has problems related to independent journalism and investigative journalism. So, we do those kinds of things as well, and with freedom of expression and also freedom of opinion everything has been subsumed under that, whether it’s protest or digital rights or journalism, there’s just so much going on, there’s so much attack on those things.
04:16 Will Duffield: And so were you then, are you the first Special Rapporteur of the platform Internet age?
04:24 David Kaye: So my predecessor, a Guatemalan activist named Frank La Rue, really led the way in many respects. He did a report in, I think it was 2011, on freedom of expression and the internet. In 2011, social media had already launched, but it wasn’t what it is today, but he really provided some principles. Very interestingly, and I think people who are listening to the podcast and also many people here at Cato are interested in issues of surveillance, and the interaction between surveillance and freedom of speech and freedom of opinion, he did this really groundbreaking report in 2013, to the Human Rights Council. It came out about two weeks before the Snowden revelations. And he provided a kind of survey of concerns about surveillance, and how surveillance can interfere with privacy, which in turn can have a deep chilling effect on expression. So, he basically gave a legal framework for thinking about that. So in some respects I’m building on his work and also the work of others in civil society and in some governments, the Council of Europe, for example. And the book goes into this in some detail: European public institutions are often quite far ahead of American public institutions in thinking through the rights component of many of these issues.
06:04 David Kaye: I will say, there’s a report on this. So I did a report to the Human Rights Council on what we call the digital access industry, basically. And it’s your CDNs, it’s your telcos and ISPs. Those actors also play a very big role, and they’re different in certain respects, in very important respects, from social media and others at the content layer, because those actors have to make a choice to go into a market or to accept a client. So if you are, say, Telia, a major Scandinavian telco with operations in Central Asia, they have to make a decision: “Do we go into Tajikistan, knowing full well that Tajikistan’s security services use this Russian SORM, this device that basically hoovers up all information from all traffic?” Do they do that? That involves, I think, for a Scandinavian company, a kind of compromise in their values. So they have to make that choice, whereas Facebook, YouTube, Twitter, they’re either available or they’re not, at the will of governments. And that just presents a different setup for them, I think.
07:32 Will Duffield: Certainly. More broadly, in thinking about that Tajikistan example, when we think of the internet as a whole, and its effect on freedom of expression, there are trade‐offs. How do you approach them? How do you see it shaking out? Especially looking around the world, because often here in tech policy in the United States, we think that the internet ends, if not at the United States borders, at least the borders of Europe perhaps. But that’s not the whole story.
08:11 David Kaye: No, it’s not. Small question there. Well, that is a really big part of the book, thematically. And so I… Maybe backing up a little bit, I still have this kind of optimistic belief that the internet, including social media, I would say especially search, has really opened up vistas for people around the world. It has provided access to information that people really had to hunt for, and would be very unlikely to find. So like if we look at the net gains… I like that, net gains, no pun there, [chuckle] the net gains of the last 20 years, I think, are enormous. But what we’ve seen is that there’s kind of darkness there as well. And I’m not talking about the dark web. I do mean this kind of centralization that has given extraordinary power to both private actors and public institutions to restrict that access to information. And I think one of the things we’ve seen is this, it’s not really the techlash, which is I think an American kind of phenomenon right now. But it is… It’s governments that have been, I think, accustomed to seeing their role as policing public space, and losing that control to private… Basically private American companies based in Northern California.
09:42 David Kaye: And so this has been… The trade‐off, generally speaking for a company that wants to go in and provide access to information is… It’s a hard one, and I don’t pretend in the book to have the answer to that, but it is a hard one. Are you… Are you providing such a service that some form of censorship, some form of restriction is tolerable? I mean, we have to go down the list and think specifically about those restrictions, and it varies around the world. And some of… And I guess just to close this part off, my big concern is that in this totally legitimate effort of governments to have some, at least some insight and oversight over this space, that they’re completely overreacting in many, many parts of the world, and causing… Just as surveillance has, causing really deep restrictions on expression that go beyond what I would, and I’m guessing many people here would, consider to be acceptable trade‐offs.
10:45 Will Duffield: You write about the Google Spain, “right to be forgotten” case, saying that it “points to the use of democratic institutions, the tools in place where rule of law prevails in order to restrain the operations of corporations.” Which sounds… Sounds nice. But from my perspective at the end of the day, while the law applies to Google it affects me as an individual. Even here, if you look…
11:12 David Kaye: Totally.
11:13 Will Duffield: Particularly at some of these global take‐down orders we’re moving towards now, and my ability to gather information about the true state of things. So I end up finding myself wanting Google to have a very wide latitude, so that I can track down whether my doctor’s been censured in the past for poor surgical performance.
11:37 David Kaye: I totally agree with that. And my point in talking about the right to be forgotten in the Google Spain case is less about the merits, ’cause I share your concern. And putting it in the context of European rights and their perception of fundamental rights: data protection, explicitly in European law, is a fundamental right on par with freedom of expression. So my concern, on the merits of the case, is that it tilts much too far in the direction of privacy over expression. And we can see that because Google Spain created this rule that Google has to apply, that is basically a rule of relevance. If you do a search, and it has to be a name‐based search, and it comes up and it’s “no longer relevant”, and I’m simplifying the standard, then you can ask Google to take it down. And what it does…
12:36 David Kaye: So the first part of it is… To me, that’s interesting, is it started with a lawsuit. It was an actual, a Spanish citizen making use of democratic institutions, in order to put constraints on a foreign company. So, to my mind, that’s… Like that’s a democratic tool. That’s… I like that part of it, right? Many people around the world lack that. They don’t have independent judiciaries. They don’t have the ability to really pressure a company to do something. So it’s an unusual situation. But the other part of it that I use, one of the reasons I use it in the book, is that what happened was the top court in the European Union decided Google Spain… They said, “Here are the rules. Now you, Google, adjudicate.”
13:31 David Kaye: I mean, it didn’t say, “Okay we’re gonna… Every state in the union has to create a system so that somebody can apply to a court that would then order Google to take something down.” Which, that would be a democratically accountable way of doing it. I think in the United States, we might think that’s a little bit nuts, but that’s completely consistent with democratic principles in Europe. And it’s consistent with human rights principles there, as well. So, they didn’t do that. They instead, basically, increased Google’s power to adjudicate what this norm looks like, and that… To me, that is concerning.
14:05 Will Duffield: And you write about that, as well, in the context of NetzDG, the German take‐down law, which again requires a whole host of platforms to be the interpreters and judges, really, of the extent of German hate speech laws.
14:22 David Kaye: Yeah, exactly. And your point, I think, is really important: this affects American users. So, it affects Americans in at least two ways. One is our access to information in Europe. Maybe you have a doctor who, like you say, comes from Europe, and maybe their medical practice isn’t gonna be taken down, but maybe there’s something else in their life that they consider irrelevant, and they apply to Google and get something taken down. But to you, and we take patient autonomy really seriously in this country, you wanna know the full picture, and you might not have access to that. So that’s one part of it. The other part of it is that, because the companies operate at scale, any rule that might be imposed on them in Europe… It depends on the rule, but it is certainly within the realm of likelihood, [chuckle] let me say, that they’ll apply that new rule globally. So it turns out that a rule in Europe around hate speech develops into a platform term of service, more or less, because they shift over and say, “Okay, if that’s your rule, we’ll just make that the rule for all of our users.”
15:39 Paul Matzko: Easier. Yeah.
15:40 David Kaye: Much easier. And maybe… Maybe that’s fine, particularly if that principle is broadly in line with thinking about either human rights or freedom of expression, but it still excludes Americans from that process, and that’s I think concerning.
15:55 Will Duffield: I think for me, not being a European, then European democratic decisions aren’t really any more legitimate vis‐a‐vis me than Google just deciding things.
16:07 David Kaye: Yeah. Right, exactly. But this is why I think… What’s happening in Europe, again, I do not see as some dark thing. If they’re heading down a path of regulation, there are smart regulatory tools that they can adopt, and by and large they haven’t been doing that. But in principle I don’t see a problem with it; the problem is that this is more or less absent from the American discussion. Take the discussion around breaking up Facebook, for example. I’m not a competition expert, by any measure. But one of the things that has been striking to me is that it’s the flip side of what’s happening in Europe, in the sense that it’s really an American competition discussion that doesn’t take into account the fact that, for Facebook, for example, 85% of their nearly 2.5 billion users are outside the United States. So what impact does the breakup of Facebook have on those users? It might be completely benign. We might be totally fine, but they’re excluded from that conversation, much as we are excluded from the European conversation. And I think that recognizing that, and then thinking about what kind of principles, if any, we wanna see online, is important.
17:28 Paul Matzko: This framing, though, makes me think: okay, so when Google or Facebook or any of these platforms do it themselves, and indeed are encouraged to do it by national governments, that is seen as free from democratic norms. Whereas if we shift that to a set of civic institutions, well, now it’s democratic. Why is what Google does, or any private company, why is that not democratic? And if it isn’t, why should it be? Do you see where I’m going with that? Where…
18:00 David Kaye: Yeah I do. And maybe I can answer it, and you can tell me if I’m not exactly answering the question. But first off, I think that there’s variation from jurisdiction to jurisdiction. And Nicole Wong, who’s one of the early deciders… There’s a great Jeff Rosen story in the New York Times magazine, like 10 years ago, about Nicole and her team at Google making decisions around speech. So it’s not like this is a new issue. But she pointed out in an interview, and I think this is correct, that our way of thinking is that the companies also have First Amendment rights. They have the right to decide what the platform looks like, they have the right to decide their brand, basically. So I think that’s fair. And in the American context, in particular, there’s a significant amount of competition. And it’s competition that’s not just platform competition, it’s competition among social media, traditional media, broadcast, radio, podcasts. Now, there’s a whole range of things. There’s a different valence to that situation in places where these American platforms really do dominate public space.
19:12 David Kaye: So I’m thinking about places, not just the Myanmars or the Philippines, those kinds of examples, but even in Europe, where Google is a very dominant, let’s say, shaper of public space and yet there really is limited accountability for their decisions. And I think just looking at it from the perspective of those governments, and we could keep it with democratic governments, because at least we can assume that there’s some good faith effort there. I mean, not always true, and healthy skepticism for sure, but I think they’re making an effort to say, “This is a company decision, oftentimes driven by branding or business model, or whatnot. But you’re having such an impact on our civic space. You’re having an impact on, say, our debate around immigration in Germany, and taking in refugees. You’re having a huge impact on our elections, for example. You’re creating that space, and so we feel the need that this space should be regulated.” My concern has been that in that regulatory move, governments again have been giving the companies more power, and not putting themselves in a position of being democratically accountable. So that’s a very long way of… I hope that’s kind of getting to what you’re asking.
20:35 Will Duffield: Well, and I think you see it on both sides as well. And neither states nor platforms really want to be decision‐makers, when they can avoid it. Looking to the horrific Myanmar example, in that case, you had a state which was attempting to push out genocidal propaganda. But at the same time, Facebook, which didn’t have a lot of local knowledge, reviewers who understood the language, etcetera, seemed very deferential to the government in making take‐down decisions. So, to what extent should states privilege the concerns, or should platforms privilege the concerns of state actors in the content moderation process? And how should that be moderated by platforms’ lack of local knowledge? Because it seems, in cases in which you have more knowledge, you might… These might be places in which you can defer to the state a little bit more, but it’s precisely in the spaces where it’s easiest to defer to the state where you don’t have that local knowledge, that the consequences of doing so can be most disastrous.
21:54 David Kaye: Yeah, that is such a great question. It’s a real insight into the problem that the companies face, or one of the problems that they do face. They have very little insight. Let’s be honest, it’s easy for an American company to have insight into even small‐town America and the kind of content that you might see at the hyper‐local level on Facebook, let’s say. That’s accessible to them and to us, in a cultural, political and legal way.
22:24 Will Duffield: Yeah. You can get the hidden meaning in something.
22:26 David Kaye: Exactly. And it’s not just language, right? So when you think about a place like Myanmar, or as I’ve been trying to raise… I wouldn’t say raise the alarm, but I want the companies to be thinking about a place like Ethiopia right now. So places where there’s been years of repression and suddenly the lid is taken off, and then you have sort of an active media environment: what is the role of the companies and the platforms when that environment turns genocidal, or turns into ethnic conflict? How does a company get access to that really important information? I think of it very loosely, ’cause I don’t mean it literally, but how do we think about democratization of the platforms? ‘Cause it can’t just be hiring 1000 more language moderators for a country. There’s something deeper about the nature of interactions. Colin Powell, before Iraq, said, “You break it, you own it,” which, apparently, is not actually the Pottery Barn rule. But here the situation is, “You built it, you own it.” The companies built these platforms, they create space for these problems, and they have a responsibility to deal with these kinds of important problems.
23:47 David Kaye: So in that context, what do they do about governments that want to manipulate the platform themselves? The companies have developed some, I think, routines of pushback. But the problem is their pushback relies at the end of the day on, “You’re asking us to do something that’s inconsistent with our terms of service.” My proposal, and I’ve been kind of proposing this for a while, is: why don’t you rest your rules on globally recognized standards? They’re generic at a certain level, although there’s a lot of jurisprudence around human rights norms that binds all states. And so part of my pitch to the companies, and my pitch in the book, is rest your rules on human rights standards, basic standards of freedom of expression. And then at least when you go back to those governments that are saying, “Take this content down because it’s critical of us,” or, “Why did you take this content down? It’s our military preserving national security,” or something like that, you can go back to them and say, “Look, this isn’t about a business decision. This is about what we understand to be our responsibility to protect our users, and the public, right now.” Ultimately, a government can say, “Sorry, we’re gonna shut you down.” They do have that ultimate tool, but I also do think that the companies have more leverage to do that than they sometimes admit.
25:12 Will Duffield: The other big suggestion in your book is transparency. And you mentioned both decisional transparency and rule‐making transparency. What are the differences between those and what would either of them look like in practice, as applied to these platforms?
25:31 David Kaye: Yeah, I think, so transparency often is… It’s like a mantra, and we need to give it content. What does it mean to say that the governments… The companies are more transparent? And I do think that one of the major problems of our debate over the last couple years has been basically an information asymmetry: the companies have all of the information about how their rules are adopted. That’s the rule‐making transparency. They should disclose more about how they adopt rules. And they are like massive bureaucracies, that’s another part. We tend to see from the outside, “Oh, they just decided today to adopt a new hate‐speech rule.” Well, it’s part of a bureaucratic process, and I think that part needs to be less opaque than it is now, and even more so the actual enforcement of those rules.
26:26 David Kaye: I mean, the Nancy Pelosi video last week was a kind of perfect example of the debate around these issues. Because few people really understand, including myself, what is the enforcement history around manipulated content? If we at least knew that, we’d be on firmer ground for public debate about, “Well, what should the companies do? What does it mean to be consistent?” Right? What does it mean to say, “Well, this manipulated content should be taken down”? Are we talking about, is this the first time? And I think that a lot of people responded with really excellent suggestions about technical solutions, and that’s great. That’s like an A plus; there are a lot of A plus answers, but to a different question, because I think even before you get to the technical, the companies still are gonna have to make a decision. Are we going to draw the line at all manipulated video, all manipulated content?
27:35 David Kaye: That seems, first of all, technically very difficult, increasingly difficult, although it’s always a cat‐and‐mouse game between creators and the companies. Do we want that? But wherever you draw that line you’re gonna have to have a rule, a rule that basically informs the algorithm. So it can’t be just a technical decision, it’s a values‐driven decision. And we can say, it is acceptable, at least in the United States, to say, “It’s a company, it can decide what it wants to do.” Countless people have explained to me that the companies aren’t bound by the First Amendment, thank you…
28:18 Paul Matzko: [chuckle] If you weren’t aware, Mr. Law Professor. Yeah. [chuckle]
28:18 David Kaye: As governments are. Yeah right, I just learned that. No, but that’s obviously true. But the reason why this is such a big debate is because these companies have massive impact on our public life, there’s no way around that. And so I just, I kind of wanted to in a way ask people, think about this problem. You might come out wherever on the map, and that’s totally fine, but justify it. And understand, like you used the word trade‐offs, understand the trade‐offs that it will take place wherever you draw that line.
28:55 Paul Matzko: I hope you found our interview with David as interesting as we did. Will and I had a few thoughts about the interview, the topics we touched on, things that just didn’t come to us in the moment, that we wanted to talk with each other about here. Now, Will, what was one of your big takeaways that we didn’t… We did touch on a little bit in the interview, but you wanna flesh it out a bit more?
29:14 Will Duffield: So stepping back to a wider, higher level, much of our conversation revolved around the internationality of the internet and the speech it hosts, the fact that it’s governed by an interlocking web of legal systems and state demands, and, under‐girding them, different ways of thinking about speech. That’s what informs these different legal systems: different beliefs about speech’s value and the damage it can potentially do. And historically, the internet has been driven by America and a certain Western set of speech norms, which are now increasingly being contested by European authorities who come from a very different tradition surrounding speech, who see it as potentially much more threatening, or, even in the context of a right, as a right to be balanced against others in a broader kind of give‐and‐take. The direction of this international conversation, or eventual agreement around the norms we should hold vis‐a‐vis speech, how we should feel about speech, and through it, how people use the internet, will really shape the sort of net we end up with.
30:55 Paul Matzko: Do you think that undermines the value of… So, David proposed using the Universal Declaration of Human Rights as this global template, a one‐size‐fits‐all shared set of values. Do you see that national and local norms, the tension between that, even American kind of speech values, do you still see that problem with this kind of…
31:18 Will Duffield: Yeah, I see it on both sides. I think David was right in his claim that at the end of the day, while we might see the UN or human rights‐based speech framework as more restrictive than what we have under the First Amendment, it actually tracks with, or isn’t much more restrictive than, most contemporary platforms’ policies. However, there are a lot of Americans who think that those policies are too restrictive, and want an internet governed under a First Amendment standard, or at least the ability to inhabit spaces online governed in that fashion. And on the other side, I think there are places around the world which are far less liberal vis‐a‐vis speech than the UN Declaration on Human Rights, or even the United States, and aren’t going to take kindly to, or won’t easily accept, either an American or an international standard. And some of that has already driven the extent to which we see fragmentation in the internet today, the fact that China has, for the most part, its own separate internet ecosystem. Now, some of that is protectionism, it’s not just a speech concern, but how the Chinese state, the communist party, approaches the speech of its citizens has had a tremendous impact on how they’ve structured their internet.
33:01 Paul Matzko: Well, it feels like too… To some extent, the horse is out of the barn. There are all these different fracture points. So like one of them is, well, there are authoritarian governments that are just creating hard firewall type, their own walled‐garden internets. We also have the example of unintended consequences like GDPR leading to European internet users not being able to access certain American newspaper websites, because compliance is just not worth it. So in a sense, it’s not as…
33:30 Will Duffield: Well, and that’s inadvertent fragmentation.
33:32 Paul Matzko: So we have inadvertent…
33:32 Will Duffield: The point of the bill isn’t to cut Europeans off from American and local news sources, but…
33:38 Paul Matzko: That’s effect, yeah.
33:39 Will Duffield: Sometime this next year during the primaries, when some candidate behaves foolishly in Iowa, and folks follow a link from the Guardian and can’t quite get to the local story, they’ll realize that it does deprive them of information, but also it just places them further from Americans. We share less cultural context, even on the margin, for that GDPR change.
34:06 Paul Matzko: So we have these different, whether purposeful or unintentional, we have these different fracture points, which are creating “internets”, right, increasingly so.
34:17 Will Duffield: And on the other side, we still see convergence; both are happening at once. Because plenty of Americans are using platforms that have adopted effectively European hate‐speech norms. And increasingly, they see those norms as normal, because that is how their online interactions are governed, and it perhaps also comports with a certain Marcusian understanding of speech that they buy into, but that in and of itself represents a certain convergence. The same when American airlines or hotels are bullied into changing their listings vis‐a‐vis Taiwan or Tibet. So I think it’s important to recognize that it’s not just an either/or; both can happen, and they can be happening at the same time, which is easy to mistakenly see as a binary.
35:18 Paul Matzko: It’s hard to make any kind of grand prediction about whether we revert back to the World Wide Web. So I have a concern that’s not unrelated, but comes at this from a little bit of a different angle. As I was listening to David talk about the Universal Declaration of Human Rights, it was, at the time, kind of an inchoate concern. But not long after we interviewed him, he retweeted a summary of an article by April Glaser, a journalist at Slate, in which she calls for broadcast-era regulations to be applied to the internet. She explicitly calls for Fairness Doctrine-style regulations as well as the public interest standard. And this is something I study in my research on mid-20th-century broadcast regulations. It put a head on those concerns that I had even at the time, which is that I’m not sure…
36:19 Paul Matzko: I actually think there is a meaningful difference between freedom of speech and freedom of expression. Not in terms of the words themselves; speech and expression are essentially synonymous. But in how they’re actually employed, there can be a meaningful difference. Freedom of speech, as it’s expressed in the US Constitution, is a negative protection: Congress shall make no law. The obligation there is for Congress to do nothing, the idea being that that promotes freedom of speech. Freedom of expression, by contrast, is often employed as a positive right, which can include not just a negative protection, the government refraining from doing something, but the government actively doing things to promote speech.
37:07 Paul Matzko: And this actually came up back during the Fairness Doctrine debates of the 1960s and ‘70s. This is how people, the ACLU for example, backed the Fairness Doctrine. When people said, “Aren’t you saying, by saying that speech should be balanced, that if a radio or television station has a conservative on, they should also have a liberal on to balance them out, that you’re tampering with that station’s freedom of speech, their right to air whoever they want?” And they would say, “Well, yes. But we want to promote more speech, and more speech by more diverse people. So we are actually protecting freedom of speech, ’cause we’re going to provide more and better and different speech.” But you can see how slippery that can become, because in effect, all my research shows the Fairness Doctrine being used to punish politically unpopular points of view in the 1960s and ‘70s. That’s where the difference between a negative protection and a positive right can be meaningful. And even Kaye himself, when he was retweeting Glaser’s take on public interest standards and the Fairness Doctrine, I don’t know if he has that same concern that I would have.
38:18 Will Duffield: So I think a positive/negative right distinction matters and can be very meaningful, especially when we’re thinking about silencing some in order to allow others to better speak or speak more comfortably. However, I don’t think you get there in any way from the speech/expression distinction. There’s no reason why limiting the expression of some, in order to allow others to more comfortably express themselves, would flow from freedom of expression as a concept.
38:58 Paul Matzko: The way I’d put it is that nothing in the text of the Universal Declaration of Human Rights means that you have to interpret it that way.
39:05 Will Duffield: Even when it comes to pure speech, that can be understood in a positive fashion as well. Look to the free speech provision of the California Constitution: it includes a positive right to speak publicly on matters of, I forget the precise text, but it uses the word speech, and nevertheless proposes a positive right. So I worry about treating freedom of expression as some kind of weasel word that can take us down a dark path. I don’t think there’s anything dangerous about it, and I think it does capture valuable human activities, ways of making ourselves heard or evincing meaning, that aren’t always captured in pure speech. And pure speech, as a standard, can lead us to offer greater protections to certain sorts of written words, perhaps, or recorded speeches, while we might then ignore a painting or a dance, or even something that sounds more like an expressive outburst than a well thought-out argument.
40:29 Paul Matzko: Actually, I agree with you. I don’t think speech versus expression, the words as they’re literally on the page, is a meaningful difference. And I agree that there’s always been a temptation to think of speech very narrowly in American history, which is why even the Founding Fathers put the freedom of the press alongside speech to make it clear. Because in Europe, there was this tradition in the 17th and early 18th centuries of government saying, “Okay, you can say whatever you want, you can speak whatever you want, but you can’t print whatever you want. That’s different. On the page, it’s different.”
41:07 Will Duffield: It’s interesting when we think about other proposed controls today on the velocity of speech, because that’s precisely what those were.
41:15 Paul Matzko: Flesh it out for us. Yeah, yeah, yeah.
41:16 Will Duffield: Well, think about a limit on bot speech today, perhaps, or breaking up platforms to create firebreaks for misinformation or bad speech, as Glenn Reynolds proposed in a recent book. These are attempts to limit the velocity of speech, the speed at which a given piece of information can travel between persons. And if you think back to prohibitions on printing rather than merely speaking, especially at a time when transport was a bit slower, well, it would take a while for an idea to get around. If you had to remember the whole thing, walk around to people you knew, repeat it aloud, and get them to remember it and pass it on to the next fellow, ideas spread slowly. If you could hand a printed pamphlet about, you could all get on the same page, as it were, much more rapidly.
42:08 Paul Matzko: That’s actually really interesting: a restriction on the velocity of speech, and then applying that to the digital age. I really like that. So I think that sums up some of our big thoughts in the aftermath of the interview. Actually, the week after we recorded, there was another big hullabaloo. It broke the day of our recording but became very much a topic of conversation afterwards: the Crowder YouTube controversy. I don’t think we need to litigate the details of that case here, but it reminds everyone that this is going to continue to happen. Cases like this, disagreements like that about content moderation, are very much a part of our digital future. So I think this was a timely conversation. Thank you for coming on, Will, and until next week, be well.
42:58 Will Duffield: Thanks for listening. Building Tomorrow is produced by Tess Terrible. If you enjoyed Building Tomorrow, please subscribe to us on iTunes, or wherever you get your podcasts. If you’d like to learn more about libertarianism, find us on the web at www.libertarianism.org.