E85 -

Should the First Amendment be the sole standard for content moderation on social media platforms?

Hosts
Paul Matzko
Tech & Innovation Editor
Guests

John Samples directs Cato’s Center for Representative Government, which studies campaign finance regulation, delegation of legislative authority, term limits, and the political culture of limited government and the civic virtues necessary for liberty. He is an adjunct professor at Johns Hopkins University. Samples is the author of The Struggle to Limit Government: A Modern Political History and The Fallacy of Campaign Finance Reform. Prior to joining Cato, Samples served eight years as director of Georgetown University Press, and before that, as vice president of the Twentieth Century Fund. He has published scholarly articles in Society, History of Political Thought, and Telos. Samples has also been featured in mainstream publications like USA Today, the New York Times, and the Los Angeles Times. He has appeared on NPR, Fox News Channel, and MSNBC. Samples received his Ph.D. in political science from Rutgers University.

If you’re listening to this show, you’re likely an ardent supporter of the First Amendment. Yet at the same time, you probably wouldn’t want your social media feeds filled with pornography and hate speech; removing such content requires tech companies to engage in content moderation. Are those two values in tension? Can content moderation coexist with free speech? John Samples, who is on Facebook’s independent oversight board, joins us to discuss how he tries to balance his obligation to promote free speech with giving users the moderation that most of them want.

How far should we protect free speech online?

Transcript

00:04 Paul Matzko: Welcome back to Building Tomorrow, a podcast about tech, innovation, and the future. Today we’re discussing what social media companies should do when the idea of Internet virality suddenly becomes literal: that viral post that your boomer uncle shared on Facebook back in March or April about how Anthony Fauci conspired with Bill Gates to invent COVID-19 so that he could get rich off a vaccine. It’s probably the same uncle who gets drunk at family reunions and starts yelling a lot about conspiracy theories. What should Facebook do about that post? I mean, on the one hand, they might have an obligation to remove disinformation about the virus, because in this case, disinformation can kill people.

00:49 Paul Matzko: At the same time, removing posts simply because they are false, and not because they’re obscene or call for violence or whatever, well, that can get messy really quickly. Sometimes truth can be in the eye of the beholder, and it strikes people as a kind of censorship. So how should Facebook, and really any social media company, decide what content to remove? My guest today is my colleague at Cato, John Samples. John was recently appointed to Facebook’s independent oversight board, the board that will take on the toughest content moderation cases. That means John is gonna be one of the voices in the room shaping the future of Facebook’s approach to free speech issues. And given that there are 2.6 billion Facebook users worldwide, that’s an important role, and it’s one I’m glad to see a committed civil libertarian take on.

[music]

01:45 Paul Matzko: The announcement that you are gonna be part of Facebook’s independent oversight board was in May. I mean, you obviously knew about it before the rest of the world, but that’s right in the middle of the pandemic. How has the pandemic thrown the kind of plan for the roll‐​out of the independent oversight board? Has it thrown a wrench in the works at all?

02:13 John Samples: Now, there’s an interesting question here. I believe the idea, even before the pandemic hit, was that a lot of the work, maybe most of the work done by the oversight board, was going to be done online. When we do have meetings, you become completely aware of the problems here, or just the time issue, right? It’s often 6 AM here on the West Coast and 11 PM at night for someone else; there’s almost a whole day’s difference. So there were always gonna be these online challenges, but it’s gonna have to be that way, ’cause people are spread throughout the world. I think there were going to be face‐​to‐​face meetings a couple of times a year.

03:06 John Samples: So in some sense things did not change there, and I believe almost certainly the work of getting people up to speed on their roles and on processes, all of that was going to be online, even if we hadn’t had the pandemic. Not all of it; there would have been some meeting face‐​to‐​face. But I suspect a lot of it still would have been the way we’re doing it. Indeed, when you look at what they’re doing, I don’t think they could put that together in a month or two. It’s a fairly complicated kind of schooling, really, to learn the processes and what we’re doing. It would be pretty impressive if they did it in a very short time, so I suspect that was always going to be online. So in the end, in terms of how fast we get to work, the pandemic probably didn’t matter, or probably won’t matter much. In itself, I don’t think anything was delayed, would be my guess.

04:15 Paul Matzko: That’s fitting. Online services, from internet provision itself to the content providers, haven’t really been disrupted by the pandemic. It’s the stuff that we haven’t digitized, the bits of the world that haven’t been eaten by software, to use Marc Andreessen’s phrase, those are the parts that have struggled the most in response to the pandemic.

04:41 John Samples: Well, there’s another aspect of this, ’cause in our work so far, I think this is no great surprise, but it re‐​emphasized it for me: a crucial part of the success of this enterprise is going to be based… Unfortunately, I mean, I’m not sure I’m an ideas guy, but a lot of it has to do with people and their relationships with one another. And some measure of the success of this will be based around the question: how well can this group of people work together, particularly deliberate together? How well can they support free expression, which is a big Facebook value, when they disagree about other things? And I think they can do that better if they know each other better. It’s just like the seminars at Stanford; there’s a face‐​to‐​face element that is irreplaceable. Now, we’ve done actually pretty well with some small group discussions online. I thought that was actually a pretty good substitute. So I’m pretty confident that we’re doing surprisingly well, but I’m hopeful we get to see each other soon, ’cause I think that adds an additional element and will contribute to the deliberative element down the line.

06:09 Paul Matzko: Yeah. As you’re talking about the deliberation, maybe I should back up a second and ask you, for our listeners who aren’t familiar with Facebook’s independent oversight board, what’s the purpose of the board? What are you being tasked to do?

06:29 John Samples: It’s actually, I think, going to be demanding and challenging, but it’s pretty straightforward. Facebook, and perhaps at the end of the day all social media, do content moderation. They need to do that. What content moderation means is that they either remove things from the platform, or, in Facebook’s case, you could say the refusal to allow fake accounts is part of it too, but there is something that has to be done to content; sometimes it’s removal, sometimes it’s just making the audience smaller. And they were doing that inside the company; they came up over time with the rules, they eventually published the rules and so on. So they were doing it internally, and I think there was a growing awareness, and then a decision, that it was gonna be difficult to keep doing that internally, that it was too much power for one company, and that there was going to be a trust issue, that people were expressing distrust and so on. So that led to thinking, well, how could we deal with that? How could we gain legitimacy for these inevitable decisions about content moderation?

07:53 John Samples: Let me just spend a second on that, particularly on the… Maybe on the libertarian side, maybe others [08:00] ____ might really think, well, why does there have to be content moderation? Why can’t we just have no gatekeepers, and whatever is on the platform is there? And the answer, I think, is it’s a business, and if you don’t have the ability to remove some content, you’re not going to do the job for your shareholders or your owners as well as you could, because some people are going to be put off by what’s online, or, as the phrase often goes, they just won’t feel safe. They will feel like they’re in an environment they don’t wanna be in. So it’s the business part, really, that I think makes content moderation inevitable. Then you’ve got to make it legitimate. And that’s where we come in. Facebook went through a lot of trouble and spent a lot of money to set up a trust to separate Facebook from the oversight board, and to have certain kinds of qualities that would create a genuine independence, a possibility for reasonable people to see the oversight board as separate from Facebook and its business interests.

09:18 John Samples: And so our job is to initially deal with appeals from Facebook take‐​downs, decisions about what kind of expression is on Facebook, and to answer the question: Facebook believes this expression violates the rules; does it? And if it doesn’t, if Facebook made a mistake about a take‐​down, the company is committed to making our decision binding. So when something’s taken down and we say it shouldn’t have been, it will go back up. There are also a couple of other things. Eventually, decisions about leaving stuff up will also come to us, so that if something’s left up that shouldn’t have been, we’ll be able to take it down according to the Facebook community standards. And then if Facebook asks us about policies, we can give our opinions. We may do that also in some of the decisions; I’m not clear about that, I want to see how that works out. But generally, the idea is that Facebook will ask and we will tell. Those are not binding; that is, the policy advice is not binding, but I think they’ll probably take it pretty seriously, given that they went through the trouble of asking. And that’s it, that’s really it.

10:45 Paul Matzko: So as we’re talking, I’m also reminded of a common complaint online about social media and the idea of content moderation in total, which is, well, why can’t there just be a First Amendment standard? And I think you were effectively responding to that question there: the idea that there should be completely free internet speech, and therefore there should be no content moderation by Facebook. That line plays well among libertarians, for a variety of reasons, though I think it’s not always well‐​intentioned. So how would you appeal specifically to a libertarian heavy internet user, someone very engaged in social media, maybe they’re [11:39] ____, they’re very online people, VOPs, as I call them? How would you go about saying, look, it’s actually consistent with libertarian principles for companies like Facebook to be content moderating?

11:56 John Samples: I think libertarians in this regard feel a tension that maybe others don’t feel as strongly. And I also think they are uniquely placed to appreciate why there has to be content moderation. The tension is, on the one hand, we are obviously all committed to the First Amendment, which involves the government, state action, but I think we’re committed beyond the government too; we hold free speech and freedom of expression to be a value. We think it’s a good thing, along with many other kinds of liberty. But on the other hand, probably more than almost anyone, and this I’ve really come to believe, we support private property. We support property rights, and that manifests itself in our thinking that it’s generally the case, not always, but generally the case, that the people who are best placed to make decisions about property, about business, about how a business should be run, are the board, the shareholders, and also the management.

13:14 John Samples: So on the one hand, businesses are not covered by the First Amendment in the United States, and many of these social media companies are incorporated in the United States. So there’s that open question of how far you choose to protect free speech. And on the other side, there is this question I mentioned before, which is: I’m running this business, I’m really supposed to maximize value for my shareholders, and I can’t do that if everybody can say anything, anytime, anywhere on my platform. So I think the libertarian strength there is to say, “Well, you know, because of the state action doctrine, the Constitution doesn’t go everywhere. It’s about government. It’s about state action.” And given that, it’s important for businesses to have this discretion; the system of maximizing shareholder value works great in a free market context. So I think that can be seen.

14:30 John Samples: On the other hand, I do think the libertarian tendency is to say, “Yes, we accept that.” In some ways, we’d love to have a First Amendment standard, but we understand why, morally and legally, it doesn’t have to be that way. What we would want is for speech to be as free as it can be, consistent with running the business, consistent with meeting your obligations to your shareholders. And that’s why I was very concerned last year, and very happy when Facebook said that their values were what they were, but that the paramount value is voice, or freedom of expression. So this is a tension that everyone has to face at the end of the day, insofar as these platforms are social media businesses. And the way to resolve it is to have the people who run the businesses keep speech as protected as it can be, consistent with their obligations to their shareholders to make the business work, and I think that can go a long, long way.

15:46 John Samples: I think you can have everything that is valuable, that people want to say, however offensive… It can be offensive, people might not like it, is what I mean there, and so on. But there can be limits that the government wouldn’t be allowed to impose, and that’s written into the system. It really has a lot to do with there being a private sector, actually a private sphere, in America, in ways that maybe I didn’t realize when I first started this. Imagine if the First Amendment applied everywhere. Think about the Cato auditorium. You couldn’t keep anybody out of the Cato auditorium, could you? How could you select people for question and answer, and so on and so forth? So I think there’s certainly a tension, but the resolution is to help a company draw that line in such a way that speech is as broad as it can be, given the need to have a business and have a market economy.

17:08 Paul Matzko: I suppose there’s another way to think of social media upholding a First Amendment standard: rather than thinking of it as a policy guiding their own content moderation system, think of it as an expectation, in their interactions with sovereign states, with governments, that governments treat the platform by a First Amendment standard. I think most users don’t want the government, whether it’s America’s or China’s, telling Facebook how to moderate their information. So if you look at it from a slightly different angle, there’s a way in which we do want them to apply the First Amendment in their interactions with governments.

18:02 John Samples: Oh, absolutely. I’m glad you mentioned that. The other aspect of this is that it’s not just users who have, or don’t have, a personal right, although that’s most people, so it’s important. It’s also, I think, the case has been made that these companies themselves have a First Amendment right. Think about the editorial rights that newspapers have, which have been established by US courts. There’s an editorial First Amendment element here, freedom of the press, freedom of media in a sense, that is at issue. We don’t want…

18:46 John Samples: In fact, that’s for me the strongest element in engaging with the oversight board. It just strikes me that the great danger is that down the line we would get regulations, probably from the federal government, that would be very intrusive on editorial discretion, on the way the social media companies, large and small, run their companies. You would get that kind of intrusion, which I think would also violate a First Amendment right. I think the companies could vindicate that right if it came to that, but I really would rather not get there; I’d rather have a private kind of institution. That’s the other libertarian element to this. We think private institutions can do a lot of stuff that many people think they can’t. I think our choices in the next few years are going to be: a private institution makes this work, or everyone is gonna conclude, well, we’ve gotta have government. I’m here to try to head that off before it happens.

20:11 Paul Matzko: Another thing that comes to mind, and it’s easier for me to say as someone who doesn’t have any kind of relationship with the Facebook Oversight Board, is something I’ve found reassuring in the last year, something that’s become more obvious over the past year for those in tech policy, those who watch the sector very closely. Folks will often say, “Well, it might all be true that this is private property, that Facebook has shareholders it’s obliged to think about, that it’s a business, not the government. But it’s so large that it is as if it were a government, as if it were a monopoly; therefore we should treat it like an extension of the government.” That can lead to calls, on the extreme end, for the nationalization of the internet and of social media, and on the less extreme end, to saying, well, that’s why it’s okay to regulate speech and content moderation on social media, even though we wouldn’t want that in other areas.

21:27 Paul Matzko: Our natural response to that, as people who respect property rights and have free speech concerns, is to say, well, look, there’s a right to exit. There are alternatives to Facebook, there are competitors. If you don’t like what Facebook is doing, take your business elsewhere. I have found that encouraging over the last year. Facebook is still a very large, very successful global corporation, one of the largest and most successful in the world. But feet of clay might be a strong way of putting it; it’s shown that there’s competition, that people are exiting dominant social media platforms. They’re leaving Twitter for Parler, they’re leaving Facebook for alternatives. Facebook had a TikTok clone, Lasso, that they shut down just a week or two ago, I think; it just couldn’t compete. One of the largest social media companies in the world got beat by TikTok.

22:33 Paul Matzko: But that actually should be encouraging. If Facebook messes up, if Facebook is doing a bad job, people will leave. People already have left to some extent. There are consequences for Facebook’s mistakes in the past. There’s a lack of institutional trust, which they’re trying to regain through this board, and the market is punishing them for that. I actually find that encouraging. I probably wouldn’t if I were on the board or a big shareholder of Facebook, but I find it encouraging as an individual internet consumer. What do you think, John?

23:07 John Samples: That’s right. Of course, here we’re often in a different set of issues in part, which is the antitrust issue, dominance, market dominance, and all of that. But I would come back to the free speech issue in the following sense: what is it that breaking up, or antitrust, can do for you on that front? And I think the problem is that the actual reality of this policy discussion is that it gives leverage to government regulators, and ultimately [23:47] ____ Congress, over Facebook, and Google too is in this discussion. It’s leverage that could be used to… This is my concern, totally: trying to avoid bad outcomes.

24:06 John Samples: Our study of the Federal Communications Commission, published I think early this year or last year, in fact shows the problems with government having that kind of leverage. And then you add to it what we saw right at the end of May, I guess: the executive order from President Trump, which had been coming for a while. He speaks directly there. He tries to redefine these companies not as private forums or private entities or private companies, but rather as the modern public square, and of course the modern public square is covered by the public forum doctrine, by the First Amendment, and so on.

25:05 John Samples: In that executive order, you could see the notions of leverage over what appears, over the curation decisions, over the business decisions and the people who own these platforms. I think there is a separate antitrust issue, but I think we’re rapidly moving down these pathways toward giving leverage to government. Now, the big background for me, and I should say everything I have said today is really my opinion; I’m wary in general of suggesting this is the Oversight Board’s view, and I haven’t talked to my colleagues, I don’t know if they share this opinion or not. But it just seems to me that the big issue is that in 2016, and maybe a little before, it became apparent that the social media companies were very important to political discussion, and especially discussion around elections, to mobilizing voters, to getting out the vote, and to persuading voters to some degree.

26:11 John Samples: And any time that happens, elected officials are gonna perk up, ’cause they’re very concerned about their fate, about how they appear on these platforms, or any platform. They were very concerned about television beginning in the 1960s, in particular 1968, and I see 2016 in many ways as the 1968 for social media. Everyone gets interested because this can affect elections, and in that context, when you’re thinking of leverage, whether it comes directly or through some other argument about policy, again, I’m wary. That’s the liberal element. The liberal element is you don’t want government deciding these questions about what speech can be heard, directly or indirectly.

[music]

27:15 Paul Matzko: Now, one of the challenges that has faced social media companies, including Facebook and Twitter… It’s been interesting to see them now; they’ve been deliberating on how they should change their policies over the last year or two, and now in 2020 we’re seeing them actually do it. One of these things is that I think all of the big companies have tried to take on COVID-19 pandemic disinformation by flagging posts: this tweet may contain inaccurate information about COVID-19; be aware, here’s a reputable source where you can double-check this tweet. I think Facebook’s been involved in that somewhat as well, but it’s shown up most prominently on Twitter.

28:10 Paul Matzko: I’m not sure where Facebook is at this point in this regard, but Twitter has also taken on Donald Trump’s Twitter account and flagged some of his tweets. That’s been very controversial: should social media companies flag an elected public official’s posts based on whether or not they’re telling the truth, or do they have a responsibility to leave them up because they’re in the public interest, true or not? Is there a special carve-out for public figures? Twitter has decided to flag some of Trump’s tweets, and that’s kind of a new thing that’s really picked up here in 2020. Do you feel, and this is not just a Facebook question, it’s about social media in general, do you think they’ve done enough or too much in regard to disinformation, whether political or medical, I suppose? And what kind of grade would you give these efforts thus far?

29:12 John Samples: Well, the issue here is that it’s entirely possible that later this year I will find myself on a panel trying to judge Facebook about these matters. So for a couple of reasons, and particularly with regard to Facebook, I’m going to pull back from saying anything definite there. I think there’s a lot on the record about that, and people can find it for sure. And the second thing I would say really goes beyond just a general recusal, which I think is a good idea because of my position: it’s become more apparent to me that judging these kinds of matters, for anyone, anywhere, really requires information about details and circumstances in particular cases, and with regard to this, I simply don’t know enough about those.

30:24 John Samples: So even if I weren’t on this board, knowing what I know, I think I should be a little wary about making judgements, because I don’t know those circumstances. What I would say is, I think the companies are thinking about this in, I’d say, typical terms for free speech. Remember John Stuart Mill’s On Liberty: Mill announces and defends the idea that the only ground for restricting someone’s liberty is harm done to others. And it strikes me that all of these companies, or anyone who is interested in freedom of speech and the importance of freedom of speech in a pandemic, are going to be thinking about Mill’s harm principle and what it might mean.

31:26 John Samples: The other aspect, I would say, and here, of course, this has come down as a Twitter issue, is, again, I would point to the interaction of political officials and these social media companies. Quite apart from everything else, it’s just a very difficult situation for people trying to run these companies. And again, politicians are highly concerned about what appears, about how they appear. And I think what we want as people living in a liberal culture, a culture that respects ideas like free speech and freedom of opinion and [32:17] ____ conscience, is that separation. I would say, without giving anyone a grade, this whole year has shown the challenges. To that extent, if something like the Oversight Board for Facebook, or some kind of institution that one of these other companies comes up with, can work, it’s a real good thing.

32:51 John Samples: It would solve some of the issues that I think we’ve seen across the board here this year. And you can see why people running the companies would like to have someone with a legitimate role to play, who could really get people’s agreement to their judgements, because getting caught in our American politics, or politics in many other countries, is a tough thing right now for everybody. I think that’s where the institution building comes in.

33:33 Paul Matzko: Well, I can imagine, actually, thinking about the grading question I asked you: as these institutions grow and become settled, and that includes the Oversight Board as it becomes a more robust and mature entity, you can imagine essentially a whole apparatus of other institutions that attempt to explain the decisions the Oversight Board is making, to interpret them, even to push them in certain directions. I’m thinking here of the way that, when it comes to university free speech issues, there’s a whole network of non‐​profits who look at universities and how they handle the speech rights of their faculty and their students, and literally grade them: Purdue University is a B‐​minus, and that’s up from a C‐​plus last year. I’m thinking here of the Foundation for Individual Rights in Education and groups like that. And you can imagine, in the future, institutions building out that will rate the decisions the Facebook Oversight Board makes, and depending on the priors of the group doing the grading or rating, they’ll get very different results. But that’s actually the sign of a maturing system, a maturing set of institutions. So that could be something.

35:05 John Samples: Absolutely. I thought about this before all of this started. I think one crucial role for what you’ve just been talking about, let’s say an institution that watches one of the major social media companies, or social media in general, is just to keep track of where we are. We tend to go, in American politics, from one blow‐​up to the other. And so things that happened two or three months ago seem like they happened decades ago, or they quickly get forgotten, and you suddenly say to yourself, “Do you remember that? We were all over each other in February about that.”

35:47 John Samples: One role this kind of institution can play is to keep track of everything that’s happened and give you an overall picture of how things are developing. It’s certainly true of the Oversight Board, or I suspect any institution like it, that its great advantage, potentially, is to give consistency to decisions, so the rules become more consistent. Now, an external observer can point out if that’s not happening; it’s even possible that a board itself might not notice that there were inconsistencies in its handling.

36:28 John Samples: Part of what law reviews do for the Supreme Court is show some of the implications that the Justices might not have thought about in some of their decisions, and so on. I do hope that everything works well enough that these institutions of accountability rise up and make the whole process work better. If those things happen, we will have gotten well down the road toward a functioning system.

37:00 Paul Matzko: There’s an interesting example; it’s a bit obscure as a matter of jurisprudence and legal history, and most folks aren’t aware of this, but in the 19th century, a great deal of what we would recognize as jurisprudence… There obviously was a Supreme Court back then; there were federal courts and state courts. But a lot of everyday law was actually decided in alternative court systems, the most prominent of which was ecclesiastical courts. Many Protestants in 19th‐​century America took seriously the injunctions in the Bible not to take a fellow Christian to court.

37:45 Paul Matzko: And so they would go to their church leaders and say, “Hey, we’ve got this disagreement over who paid for what, and we want you to sort it ’cause we don’t… We’re not gonna take this to the court system.” And so over time, there grew up this whole system of ecclesiastical courts in America, imitating courts in other parts of the world that would rule on issues of basically torts, civil liability, would rule on inheritance disagreements, who gets the money from the estate, even on issues like divorce and whatnot. So we had this whole separate, spontaneously organized grassroots‐​generated system of jurisprudence that had to evolve and grow over time. And eventually it got absorbed back into the standard secular legal system.

38:37 Paul Matzko: But in a sense, I kind of see that as what’s happening here with the Facebook Oversight Board. As these systems have grown, and content moderation is hard, content moderation at scale is hard, they’re having to develop their own informal court system. It’s not the same as a legal court, but over time they’re generating this kind of alternative legal system that has different standards than the formal legal system. There is not a First Amendment requirement in the same way there would be in a formal legal system, but nor should there be. And it’s very cool, as someone observing, to see legal history in the making as these alternative jurisprudential systems are generated, grow, mature, and develop into robust institutions, and it’s neat seeing you being part of that, John.

39:40 John Samples: Oh, I’m very happy to be part of it, and I appreciate your comments. We don’t really know enough about these kinds of judicial institutions that aren’t official and legally coercive, ultimately; it would be great to know more about that. I think there are two broad ways forward, actually. One is that, for whatever reason, you get a more centralized oversight system. The trust for the Oversight Board is set up in such a way that other companies can join it if they want to; they can make use of its services, as it were. And then the question is, will they want to? And all of those questions.

40:28 John Samples: So that’s a model going in one direction, something like the American Supreme Court, I think. But there’s also dispersed power and spontaneous order, as you said. So the other possibility is that you get a variety of different kinds of institutions, and indeed that’s what you have now. We have an oversight board, but we also have content moderators who themselves deal with the outside world as well as content moderation decisions. We have smaller companies in which one or two people handle their content moderation. We have diversity at this point, and we have other technological things coming online that I’ve heard about that will make for other kinds of institutions.

41:16 John Samples: I think in that model, where you have, call it a strong federalism or a very decentralized content moderation system, the role of something like Facebook’s Oversight Board is to learn things as we go through time, and those lessons get diffused, as in a federalist governance system. Maybe it’s how the lines are drawn around some issues about limiting speech or not limiting speech; maybe it’s other lessons that are learned. Even though Facebook is large and obviously will be dealing with a large number of issues, there still can be lessons that I think will be important, because other companies may wish to have different kinds of content moderation that draw the lines differently, that have different institutions accountable to different people in different ways. My prejudice going in is for letting a thousand flowers bloom, and maybe that… Although, have you ever noticed… You use that phrase…

42:32 Paul Matzko: It’s Mao. Right?

42:33 John Samples: It’s Mao Zedong, that general theory is that he let a thousand flowers bloom to clip them. And so, the people who spoke out ended up dead. Right? So in general…

[laughter]

42:46 Paul Matzko: Yeah. That’s right.

42:48 John Samples: So that’s not my wish at all. My wish is that you have a lot of diversity, try lots of stuff, it’s the hierarchy [42:55] ____. You’d go out there, and there are spontaneous orders developed that are different; not everyone has to be like the Oversight Board. But it also could be that down the line, for a variety of reasons I don’t see right now, companies will come to the trust and say, “This is working pretty well; we think it’s better to go with you than to try to do our own thing. We wanna sign on to your rules because they’ve worked, or they’ve got general legitimacy.” I don’t know what the reasoning will be, but I do know my own starting prejudice is for diversity and not a unitary system. We’ll see. Most important is that it not come out of Washington, or a state capital too, for that matter.

43:45 Paul Matzko: Yeah. That’s very true. Well, and that would lend legitimacy. If you’ve got that kind of buy‐​in someday from another outfit other than Facebook, it only enhances the authority and legitimacy of the Oversight Board. I don’t think it’s necessarily a reasonable suspicion, but people are gonna be suspicious that because Facebook set up this independent oversight board, it might not truly be independent, and so on, and having more buy‐​in down the line would mitigate that. And even if they don’t buy into this specific oversight board, if they just say, “This is working for Facebook, let’s have our own version of that; it’ll be independent, set up along similar lines, with pretty similar structures and pretty similar basic priors,” that itself, I think, would be a success.

44:44 John Samples: So, let me just speak to that for a second; I think I can comment on the question of independence that you mentioned. I would say to people listening to this that what I’ve seen so far in my fellow members of the Oversight Board is far from asking, “What does Facebook want us to do?” or any dependence on Facebook. Just the contrary. What I’ve seen is an incipient pride in being independent of Facebook among members of the board, and a sense that this is our role here, which is a crucial issue of professional role definition. I see in these early stages a sense that we’re here to be independent; that’s what we offer to Facebook and to the world, and it’s our job. I think that’s what, at the end of the day, everybody wants, and the members are starting out expecting it from themselves and from the board. So that’s good. I feel good about that.

46:05 Paul Matzko: That’s encouraging to hear. I’ve gotta tell you that when I Googled “John Samples Facebook Board,” an ad popped up from some organization called Accountable Tech, which is completely non‐​transparent; there was no accountability for who was funding this thing. But it told me to “tell John Samples, don’t be complicit. John Samples and the Facebook Oversight Board were supposed to tackle vital content issues, but the board has been reduced to a PR prop as Facebook profits off hate and subverts democracy.” [chuckle] They told me to tell you, John, so I’ve told you… [laughter] Have you seen an uptick in angry emails coming your way since the announcement?

46:49 John Samples: You know, I just have to say, I was on Facebook, and it’s a striking moment. I’m a fairly obscure individual who just worked away at the Cato Institute, and you’re going through Facebook and suddenly an ad pops up in your timeline that has your name in it. It’s directed to you. [laughter] It’s like, “What have I gotten myself into here?”

[laughter]

47:14 John Samples: I think what’s missing here is… First of all, I was sort of like, “You can’t say anything, Samples, you favor critical speech.” Sure, they have a right to it. And the fact that they were willing to spend money on it tells me that we should pause and consider what they say. But the real issue here is that they’ve got us failing, and we haven’t failed, because we haven’t done anything yet; we’re trying to get ready to hear cases, and I’d say we are making good progress toward hearing cases this fall. So they’re judging us unfairly in the sense that we have to do something before you can decide that we have failed. They were imposing on us this idea that we’re supposed to be doing all of this right now.

48:06 John Samples: But it was never the case that that was part of what we signed on to do, the way the institution is defined and so on. So I think it’s a bit of an unfair judgment, and what I would reply to them is that it’s unfair at this point; we’re getting ready to go. But what I know of my colleagues is that they want what you want, which is an independent board that will give us some legitimate and trustworthy content moderation. So that would be my response, and we’ll see what others say. I have not gotten any… At Cato, and maybe you’re like this too, a lot of my colleagues will get more hate mail than I do, and I always wondered whether I was failing, was I too…

[laughter]

49:07 John Samples: Did I not make the case very well, or were people just not listening to me? Was it a failure that I didn’t get these kinds of ugly emails from people? At some point I stopped worrying, but I think it’s sort of a general thing. Yeah.

49:25 Paul Matzko: I think so too. Yeah. I think they just auto‐​filled it, like, “Oh, you searched for a board name… Facebook board.”

49:32 John Samples: Everyone does that.

49:33 Paul Matzko: And if you search for John Samples, it’ll pop his name in there. I’m sure it was automated. But no, I’m with you in that category. I don’t get much hate mail, and in fact, I’m always vaguely insulted. There was this phase two years ago on Twitter where people were making anti‐​Semitic attacks on journalists, and so the journalists would preempt it by putting the triple parentheses on each side of their names, and no one ever accused me of being a shill for the Rothschilds or something. So I was like, am I not fighting hate hard enough? So you and me both, John. [laughter]

50:24 John Samples: That’s why you worry about that, yeah. At the same time, I think this will be an interesting thing for me down the line, and the decisions are tough ones, and I hope people keep that in mind…

50:41 Paul Matzko: They’re going to be controversial no matter what.

50:42 John Samples: Right. And so the controversy will lead to some… Those of us who have talked about offensive speech and tough speech and all of that, people like me, for example, are going to have to live out our faith. There could be hard things said about all of us, about our effort. Members of Congress would always say they didn’t like negative ads because they were untrue and so on. Well, there may be untrue things said about me down the line. And we’ll return to this, Paul: if I give in, I hope you’re going to attack me for giving up my principles. If I complain about offensive speech, or call for a limit on advertising or something like that, I hope you’ll attack me; I’ll deserve it.

51:38 Paul Matzko: That’s great. Well, I’m not gonna hold my breath, but yes. Well, you heard John Samples. He wants us at Building Tomorrow to help keep him accountable to the principles that all who value civil liberties share. We’ll check back in with him next year and see how it’s gone. Let me also say, while I have you with me: be sure to go check out lib​er​tar​i​an​ism​.org’s newly redesigned website. It’s not only a fresh coat of paint, though it does look snazzy; it also allows us to do some cool things, connecting topics with each other and highlighting content that is both relevant and incisive. And as usual, till next time. Be well.

[music]

52:21 Paul Matzko: This episode of Building Tomorrow was produced by Landry Ayres for lib​er​tar​i​an​ism​.org. If you’d like to learn more about Libertarianism, check out our online encyclopedia or subscribe to one of our half dozen podcasts.