E238 -

Will Duffield joins us to discuss Cambridge Analytica and the future of social media. What is Cambridge Analytica? What is Facebook doing with all this data? Should we expect more regulation of online advertising or nationalized social media platforms?


00:07 Aaron Powell: Welcome to Free Thoughts. I’m Aaron Powell.

00:09 Trevor Burrus: And I’m Trevor Burrus.

00:11 Aaron Powell: And joining us is Will Duffield, he’s a researcher with Cato’s First Amendment Project. Welcome back to Free Thoughts, Will.

00:17 Will Duffield: Good to be back.

00:20 Aaron Powell: What is Cambridge Analytica and what have they done with my data? [chuckle]

00:23 Will Duffield: So Cambridge Analytica is a political consulting firm, and we aren’t perfectly sure what they’ve done with your data. Now, to start, this is a story full of unreliable narrators; it has some confusing twists and turns, and a lot of those with access to privileged information in this story have a whole host of incentives to misrepresent what they know. The basic story of the Cambridge Analytica Facebook data scandal is that a researcher named Aleksandr Kogan, back in 2013, used Amazon’s Mechanical Turk service to hire individuals with Facebook accounts to take a personality test, which under Facebook’s rules at that time also allowed him access to a basic set of information about these test takers’ friends.

01:33 Aaron Powell: This is one of these tests that you sign into the test online with your Facebook account to answer silly questions, and then it posts the results to your Facebook.

01:41 Trevor Burrus: Which Harry Potter, Hogwarts House are you kind of thing?

01:45 Will Duffield: Yes, that sort of thing exactly. This data was then, contrary to Facebook’s terms of service, sold by Aleksandr Kogan to Cambridge Analytica. They may have used some of this data, both information from the test takers and information about the test takers’ friends, to create psychographic profiles used to potentially more effectively target political advertisements.

02:16 Aaron Powell: So this, as you’ve described it, sounds like much less of a big deal in terms of scope than the headlines have suggested. We’re talking about some people five years ago who took a personality test. So how do we get from that to hundreds of millions of Facebook users’ data being exposed?

02:38 Will Duffield: A couple hundred thousand people took the test and each of them had a few hundred friends. So, initially it was expected that 23 million Facebook users might have had that simpler set of data, the friends’ data, pulled and incorporated in this dataset. There were a number of other tests that Kogan had run around the same time, which is how we get the revised number of potentially 87 million accounts affected.

03:14 Aaron Powell: What kind of data was in this bundle that they managed to gather? So there’s data from the people who took the test, and then there’s data about the friends of the people who took the test. But what’s in that data?

03:27 Will Duffield: So, that will include who your friends are, your age, employment status, some location data. This is what you’ve chosen to post about your location…

03:43 Aaron Powell: Your hometown, or where you went to high school?

03:48 Will Duffield: Not data gleaned from cellphone records.

03:51 Trevor Burrus: Things you like, would that be in the data?

03:53 Will Duffield: Certainly, yes. Pages you’ve liked.

03:57 Trevor Burrus: Yeah. The theory here is that you could… There are so many advertisers in general; you could use that to figure out, if you liked some page for, I don’t know, buck‐​hunting enthusiasts, they might wanna advertise some shotgun shells. Actually, you don’t hunt deer with a shotgun; they might advertise some riflery equipment. But that seems pretty innocuous, and Cambridge Analytica was a political consulting firm; they used it for that purpose. And Ted Cruz had used it before Donald Trump did, correct?

04:29 Will Duffield: Yes. Though I believe Ted Cruz’s contract was originally with SCL Group, which is a larger parent firm, which has done much more government and even military work in the past. There’s another firm in the mix under this SCL umbrella called Aggregate IQ, which was more heavily involved in Britain’s Brexit Vote Leave campaign.

05:00 Trevor Burrus: So, I mean, my first reaction is, so what? That was my first… When I heard about this, I said, “Okay, so what?” Is that a good general reaction to have or is there…

05:13 Will Duffield: Always. No matter what the situation is.

05:15 Trevor Burrus: Nah, it’s okay, or is it something that was opening up a discussion that is gonna be with us for a long time?

05:24 Will Duffield: I would say when it comes to this specific instance, how this data scraped by Kogan might have been used in elections, it is a sort of “so what” response. I think a lot more of this has been going on as well that we don’t know about. But this poked above the surface, and we’ve latched onto it; because they’re a global firm, involved in elections both in the US and in Europe, it’s been easy for all sorts of people to see a niche story that applies to them within this broader concern. Now, it does presage a sort of concerning state of affairs when it comes to data in general and how it’s used. However, a lot of that contingent concern is also reliant upon your theory of political communication.

06:27 Aaron Powell: You said what was done here was against Facebook’s terms of service. So Facebook would give this data to a third party, in this case the guy running the quiz, but you weren’t allowed to then pass that data on to yet another party?

06:44 Will Duffield: Yes. And when we talk about giving data, for the most part, it was, it’s the app itself that’s collecting that data. It’s not so much Facebook. Facebook provides the platform for it, sets the rules governing apps in general, what they can pull. But the users are then authorizing the specific application they’re using to gather this data.

07:09 Aaron Powell: So when the user went to take the quiz, they had to click through something that said, “Do you want to share all of your data with this third party?” And they said, “Sure. Yes.”

07:20 Will Duffield: Including the friends’ data.

07:25 Trevor Burrus: So Zuckerberg goes to Congress, who seemed to be quite upset about this.

07:28 Aaron Powell: Everyone flipped out about this.

07:31 Trevor Burrus: And you, I think, watched all that or most of it.

07:35 Will Duffield: Mercifully, yes. [laughter]

07:39 Trevor Burrus: I saw some of the highlight reels going around but, having sat through the whole thing, what would be your description of that bizarre event?

07:47 Will Duffield: Ignorant. [chuckle] The most prominent misconception displayed in those hearings seemed to be that when Facebook advertises on someone else’s behalf, the advertiser is, in the minds of many Congresspeople, gaining a tranche of data from Facebook to use for its own advertising purposes. That’s not the case. When you seek to post an advertisement on Facebook, pay them to spread a message, you choose an audience within Facebook, the sort of people who you would like to reach, and then, they, using the data that they have on us, serve that advertisement to these populations.

08:42 Trevor Burrus: So it’s not that… It’s, again, somewhat new in the sense that we have more data about people than ever before, but it’s also just running ads. Well, we’ve always run ads. We try to avoid showing ads to people who don’t care about them. That’s pretty difficult.

09:02 Will Duffield: It’s just a more sophisticated version of saying like this television show is popular in the 18 to 34 demographic of men with incomes of certain amounts, so I’m gonna “target” that audience by advertising in front of that TV show.

09:15 Trevor Burrus: Yeah, I remember when I was a kid and watching Saturday morning cartoons and always go to my parents and being like, “Have you seen that commercial for this new play set?” They’re like, “No, we haven’t seen that commercial.” I was like, “How could you not have seen it? It’s on every… Every time. It’s always on.”

09:31 Will Duffield: And if your parents were watching along with you, the money spent to serve an ad to them was rather wasted.

09:36 Trevor Burrus: Yeah.

09:37 Will Duffield: There’s an inefficiency there.

09:40 Aaron Powell: It’s like, why the NFL, for some reason, decides that the only thing that should be advertised on NFL games is pickup trucks and enlisting in the Navy and…

09:52 Trevor Burrus: Budweiser.

09:52 Aaron Powell: Erectile dysfunction.

09:54 Trevor Burrus: Yes, and Budweiser.

09:55 Aaron Powell: None of which are necessarily anything I would consider buying.

10:00 Trevor Burrus: Yeah. So where are we now, if this is the nature of the modern world? And people seem to understand that. I think people understand that a lot of data is kept on them by a bunch of corporations. I think they don’t actually care that much, though we could talk about whether they should. And advertisers use that data, and so will politics. But is there more to this going forward? There’ll just be more data and more specific data, and it could be, “Where were you Thursday at 6:01 PM?” or whatever. And so does it get creepy at some point?

10:37 Will Duffield: Well, I think there is certainly a sort of civic republican concern that accompanies the use of targeted advertising in politics. If your neighbor is receiving different cola commercials than you, there’s no real reason to care about that. But if political advertisements are used not just to convince but to inform, and increasingly our electorate lives in different information spaces, we can see a concern there. That said, the same worry accompanies the usual individual choices of people to consume different sorts of partisan media. So it isn’t a new concern, but it might be magnified by this.

11:21 Trevor Burrus: And what about Russia? That’s the other one that “manipulated,” I’m putting that in scare quotes, manipulated our political processes via Facebook. What did Russia do?

11:34 Will Duffield: Russia ran fewer political advertisements in attempting to interfere. They weren’t necessarily what you would think of as electioneering communications. They were mostly designed to provoke partisan or tribal fears, things like sharia law coming to a community near you or, on the other side, discussing police brutality against African‐​Americans. So they don’t fall within what we would think of as political communications, but they were definitely designed to heighten tensions that already existed within the American electorate; and they often aped the style of some of the most inflammatory voices in our polity.

12:28 Aaron Powell: I’m trying to understand the privacy concerns here, because the response to this story makes clear that a lot of people think we’ve already gone way across the creepy line, and Congress thinks we’ve crossed a line that upset them in a way that the Snowden revelations didn’t. But bracket the question of Facebook five years ago allowing apps to scrape this data, which it sounds like they’ve restricted…

13:05 Will Duffield: They nixed that in 2014.

13:07 Aaron Powell: Right, so that’s not a thing. So the data is being gathered and aggregated by Facebook, and it’s being kept on Facebook’s servers. This is why the hip thing a couple weeks ago was to go to the Facebook page where you could download all your data, then dig through it and be astonished at how much data your being on Facebook six hours a day generates. But Facebook, as we said, is not sharing that data on an individual basis with outside firms. They just know that you happen to be a certain age and that you happen to like certain things, so if someone says, “I want people of a certain age who like certain things to see this ad,” Facebook will just marry them up. But is the concern then, from a personal privacy standpoint, that the data could get off of Facebook’s servers, that Facebook could get hacked? Or are people concerned by the very nature of this aggregating?

13:56 Will Duffield: That’s a concern, but it does go to the aggregating and it involves Facebook, and even the internet, less than we might expect. The fact of the matter is, there are a host of interactions which create data, which prior to the past decade or so, was very difficult to capture and organize. Your having purchased something at Target, or buying a certain insurance policy, couldn’t be brought together into some kind of cohesive, whole picture of your life. That’s changed now. So a host of things we do on a daily basis, which previously couldn’t be very well tracked, now contribute to a mosaic of our lives, and I think that is uncomfortable for many people.

14:55 Aaron Powell: Do you think that that discomfort… As a libertarian, I am uncomfortable with the notion that that kind of data exists in a silo somewhere. Even if Facebook’s not giving it away, Facebook still exists within the territories of states that would love to get their hands on that data, often to do bad things to people who are not necessarily fans of that state. And that makes me uncomfortable, because this is access to something that looks very much like incredibly pervasive surveillance. But from the private end, Facebook’s just using this to give us targeted ads, and I think everyone would admit that, on the whole, more targeted ads are better than less targeted ads, just from a user‐​experience standpoint.

15:43 Will Duffield: Yeah, I’m tired of that Cars for Kids commercial. If we could target something more salient to my interests, that would be an improvement.

15:54 Aaron Powell: So what’s the privacy concern, outside of just like, “I feel like this is creepy?” Is there, call it from our libertarian perspective, is there a genuine privacy concern here beyond that states might get this data?

16:09 Will Duffield: I would say there still is, particularly when one’s expectations concerning how this collected data can be used to discipline them are at odds with the extent of that reality. Say you used to have an Ashley Madison account, but didn’t think that was the sort of thing a co‐​worker could find out about and use to push you into taking on some assignment at work. Well, finding out that they can, and will, is discomforting. It alters the ways in which we might comfortably interact with the world around us.

16:55 Trevor Burrus: How much do you think the uproar about both Cambridge Analytica and Russia, in this specific Facebook case, is related to re‐​litigating the 2016 election?

17:08 Will Duffield: Oh, very much so. Now, I don’t think that either the Cambridge Analytica scandal or Russian involvement here, or even work that AIQ did on the Brexit campaign, altered the outcomes of those elections. However, the ability of effectively marginalized groups within our polity to use social media writ large probably had a very real impact on the election. So Cambridge Analytica and Russia are used as a proxy for that broader concern: these voices are coming out of the woodwork, and there doesn’t seem to be a very good way to hush them up again.

17:58 Aaron Powell: But what’s the problem with that? What’s the problem with more voices coming out of the woodwork? Is it that…

18:04 Trevor Burrus: They might be Alex Jones. I think that’s what some of them would say.

18:06 Will Duffield: They might be Alex…

18:07 Aaron Powell: Well, and in this case they voted for Trump.

18:08 Trevor Burrus: Yes.

18:09 Aaron Powell: Right. But these are… We’ve always had candidates that bought ads all the time and we don’t have a problem with that. And political parties buy ads all the time and government officials campaign all the time.

18:20 Trevor Burrus: And you’ve had organizations like… What is that guy’s name? David Icke?

18:24 Aaron Powell: Yes. The lizard people.

18:26 Trevor Burrus: The lizard… He’s been putting out stuff forever about that, right?

18:29 Aaron Powell: Right. I guess I just have a hard time feeling the weight of the concern that lots of previously marginalized voices are getting their say. I get that a lot of these marginalized voices are crazy, but that seems to be more a problem of why so many Americans are susceptible to believing crazy stuff. I don’t know that these voices are more dangerous than hearing your local politician tell you to do something.

19:00 Will Duffield: Yes. And there’s certainly no just reason for depriving these people of access to contemporary telecommunications technology.

19:12 Trevor Burrus: I think that the issue is… And I’ve talked about it before on Free Thoughts, in episodes where we discuss campaign finance. We’ll get into bubbles and things like that too, but if you have no good story for why someone disagreed with you except that they must have been duped by something, and I think there’s some evidence that we misunderstand the other side more than we used to, then you need an explanation for why this thing happened, like Donald Trump’s election. We’re really explaining other people’s political beliefs and trying to come up with some sort of reason why they believe these things. And that’s what’s scary to me; that’s what I realized when I saw this Cambridge Analytica thing: this is the new campaign finance, this is the new Citizens United discussion. The worry about the corporations was always that they would have so much political power to run ads that they would convince people to vote against those people’s interests, I’m putting that in scare quotes, and for the corporate interests.

20:22 Trevor Burrus: And so we’re afraid of any of these terms you hear, like, “Someone’s corrupting our democracy,” or, “We have Cambridge Analytica corrupting our democracy,” or, “Corporations corrupting our democracy,” and that always makes me just think interesting, ’cause I’m… Okay, corrupting. How is speaking to people corrupting a democracy? And there’s so many implicit premises in that. And I think that’s what we’re getting into with this Facebook stuff.

20:43 Will Duffield: Yeah, where you draw the line as to what is endogenous versus exogenous to one’s democracy tells you a lot about their theory of political communication.

20:55 Trevor Burrus: That’s a really good way to put it.

20:55 Will Duffield: The legitimacy of different forms of messaging or messengers.

21:00 Trevor Burrus: There’s some sort of line from influence to brainwash, and it runs through manipulation in the middle. And so you say, “Okay, you’re allowed to influence the electorate, and you can get out there and make your voice heard, but you can’t brainwash them, and manipulation is probably too far, too.” And so I think they would probably put, the people who are afraid of Cambridge Analytica and the future of this, they’d probably put it at manipulation. And they’re probably worried that if the data gets good enough it could go to brainwashing, where they have some sort of algorithm that says, “If you wash your car every Tuesday and you buy kumquats and like to play squash, we just have to show you six things, like a koala bear video, and then a video of Donald Trump, and then this, and then you will… Now vote.” Like it’ll crack your brain like some sort of combination to a safe. And I think that’s very sci‐​fi, but that’s somewhat what people are also worried about.

21:52 Will Duffield: I think it is, but especially in the political realm I feel as though those fears are presently overblown. When it comes to Cambridge Analytica in particular, there’s very little evidence that any of this worked, or even was perceived to work, by the campaigns that hired Cambridge Analytica. Look at Cruz’s use of them. He did it because they were the Mercers’ data analytics firm. Robert Mercer funded Cambridge Analytica; he also funded a number of GOP campaigns this past election cycle, and one way you could signal goodwill towards him and ask for his money was by hiring Cambridge Analytica.

22:40 Will Duffield: When it comes to their actual services, I was, I suppose, lucky enough to receive a sales pitch from them about a year and a half ago. I’d been working for a cannabis policy magazine in the UK that was looking to begin a legalization campaign, and one of the firms we had in to discuss potentially working on this campaign was Cambridge Analytica. They played up what they could do, a sort of slick, monorail‐​salesman‐​esque pitch but, at the end of the day, it relied on a great many gimmicks. Running a competition for whoever could correctly guess both the score and the two teams involved in the Premier League final? If you want to find potential Brexit voters, that seems like a pretty good way to go about it. But it doesn’t speak to the efficacy of your underlying algorithm or use of data; it’s just a clever idea to select for mid‐​40‐​something white men and get them to give you their email address and some other info.

24:01 Aaron Powell: Is there any way that we could measure the efficacy of it? If ultimately what you’re trying to do is influence votes, and votes are not public, we don’t have records that you can look at. We know X number of people from this area voted, and we know how the totals came out, but we don’t know that this guy voted this way unless he tells you. Is it always gonna be just that monorail sales pitch, or would it be possible to say, “Look, we… Through what we’ve done, we moved the popular vote 0.0 whatever percent?”

24:40 Will Duffield: I think it’s difficult. Maybe not impossible in certain cases, but you rarely get the chance to rerun an election while only tweaking one variable. And in order to really drill down into the efficacy of this, you’d need to be able to do that.

25:00 Aaron Powell: I want to get to the government’s response to this, and the proposals that have been put forward to fix this problem. But before we do, this has provoked a lot of soul searching in Silicon Valley, and a lot of talk about privacy. And I don’t know if it’s related to this stuff or if it’s related to the recent changes in European privacy regs, but just over the last several days I’ve gotten notices about our privacy policy from basically every Internet service that I have signed up for. So this stuff is in the air. What are people in Silicon Valley seeing as… So first, does Silicon Valley recognize this as a problem or think this is a problem? It might not be a problem, but do they think it’s a problem? And then what are their self‐​enforced solutions that they imagine?

26:05 Will Duffield: So when it comes to Silicon Valley’s perception of their role in this, I think it’s even broader than simply a data privacy issue; there’s a growing recognition that they will be treated, and must behave, as a sort of political actor. Their power has been recognized, and now they have a bunch of people lining up at the door asking for various dispensations and threatening different sorts of regulations if they don’t get what they want. So this is an emerging awareness. When it comes to what firms have done in response, it’s been fairly robust, particularly on this Facebook Russia question. Russia used Facebook groups, in many cases very large pages, to spread their messages, trying to come off as American citizens of different political bents. So Facebook is adding requirements for running explicitly political ads, but also any more general issue ad that falls within a laundry list of political categories, or for running one of these large pages, pages with X number of followers.

27:45 Trevor Burrus: And these were all… They were those America great, or things like this, patriotic sounding…

27:52 Will Duffield: Yes.

27:53 Trevor Burrus: Vague, make America wonderful now kind of thing.

27:56 Will Duffield: But going forward, and this is supposed to be rolled out before the 2018 midterms, in order to do any of that you’ll need to verify your identity, that you are an American citizen, with some form of government‐​issued ID. And you’ll actually receive a code in the physical mail that you then punch in online. And then you’ve been verified.

28:20 Trevor Burrus: This is what Facebook has announced or something?

28:22 Will Duffield: Yeah.

28:22 Aaron Powell: Shouldn’t we be flattered and grateful that Russia wants to make America great again?


28:30 Trevor Burrus: They’re just trying to help out. Yeah. I agree. We could use some help. So this is… How recently did they announce this, Facebook?

28:36 Will Duffield: This was pretty recently.

28:39 Trevor Burrus: Pretty recently?

28:41 Will Duffield: It had been in the works for a time. They’re also, and this will be fascinating for all of the DC political wonks, going to keep a publicly accessible database of all political advertisements run by all campaigns. So you’ll be able to go on and look both at what might have been run to you and at ads designed to target very different sorts of people from yourself. This will be open and accessible to everyone. It makes sneaky A/B ad testing much more difficult; I’m sure some people will be rather frustrated about that. But this is an action the government certainly couldn’t take, and that Facebook as a private company has decided to take.

29:34 Aaron Powell: What about on the broader, just‐​creepy data mining and aggregation level? One question, I guess, would be: is the “we’re just going to gather all of the data about everything that you do on our platform, and also on every website that interfaces with our platform, and whatever else” approach necessary to the very business model of these free mega‐​platform sites? Could we have Facebook without creepy Facebook data gathering?

30:15 Will Duffield: You could if you wanted to pay for it. Whether that market exists, I’m not so sure.

30:22 Trevor Burrus: How real is the threat of regulation, do you think? Maybe not regulating Facebook or Twitter in the near term, but in the long term. As you and I have discussed privately, we’re in the baby steps of the Internet and of social media, and I assume social media is not going away in 50 years, due to the human element of it. Will we have to be constantly aware of calls for regulation? ‘Cause I’m thinking of comparing it to something like the FCC, where we had a fairness doctrine until 1987, on the theory that the airwaves were limited, so if you put one side of the conversation up you had to put the other one up too, because that’s how we sculpted our political speech framework so people could be informed of both sides. Do you think that something like that, or something totally different, could come for Facebook and other yet‐​to‐​be‐​seen social media?

31:29 Will Duffield: I’m certainly concerned about the threat of regulation going forward. CDA 230 has created a pretty good…

31:37 Trevor Burrus: What is that? You have to define that.

31:38 Will Duffield: And pretty strong… CDA 230 is an element of the Communications Decency Act, and pretty much the only element that survived later judicial review. It prevents platforms or content hosts from being held liable for content posted by others. If you have a hand in making the content, it doesn’t apply to you, but as long as you are just up or downvoting what others create… Someone libels someone in your comments section, they can’t go after you. They can go after the person who posted the libel but you, as a host, are insulated. And it’s really the substrate on which the modern Internet has been built. Without it, you couldn’t run a platform like Facebook, because you’d be sued out of existence. However, well, that’s created a pretty strong presumptive norm in favor of allowing platforms to govern themselves. You are seeing it nibbled at around the edges, particularly in areas in which there’s an ugliness to what may be happening on the Internet.

32:58 Will Duffield: We saw FOSTA passed recently, which carved out a little section cutting away at these 230 protections for sites that are seen to knowingly promote sex trafficking or prostitution. It’s hard to be against an anti‐​sex‐​trafficking bill but, in its effect, you begin to expose platforms, particularly up‐​and‐​coming platforms, to liability for things frankly beyond their control. On the advertising front as well, it looks as though Facebook’s and Twitter’s moves to privately rein in their political advertising markets have, at least immediately, forestalled regulation. But some form of the Honest Ads Act will probably move forward in the future, simply in an attempt to standardize advertising rules between broadcast, print, and these digital mediums.

34:01 Trevor Burrus: What does that say, the Honest Ads Act, in a nutshell?

34:04 Will Duffield: It is an effective expansion of existing broadcast regulation to the Internet. Now, whether the power to do that really exists is somewhat questionable, because you aren’t talking about a limited broadcast spectrum anymore. But it does seem as though the political will to put something like that through is there.

34:29 Aaron Powell: What would the effect of that be? On the one hand, we could say if government cracks down it’s gonna cripple social media. The FOSTA regulations meant a lot of sites were shutting down or turning off sections that sex workers used. And so we might say, “Well, this is gonna destroy everything,” but I could also see it pushing us in the right direction. Not in the sense that it’s good these things are being shut down, but in the response to them. The sex workers launched a federated instance of Mastodon, which is basically a decentralized Twitter clone that you can run on multiple servers that talk to each other, and moved their conversation there. You might be able to shut down one instance of it, but it can pop up somewhere else, and you could build it in such a way that it couldn’t be shut down, that it’s fully distributed. And people are now back to talking, and there are hundreds of thousands of posts on it. So it might be that the response to the government going after all these centralized services and saying, “We’re gonna regulate you,” is that people go back to decentralized services that can’t be regulated, which I think would be a good thing.

35:50 Will Duffield: Potentially, though, I’m not sure those decentralized services are as robust as we might hope for them to be. That SW list was recently bumped off of Cloudflare’s DDoS protection, which took it down. It may have gotten back up on its feet now.

36:12 Aaron Powell: It’s back up, yeah.

36:14 Will Duffield: Well, that’s good.

36:15 Trevor Burrus: What is the SW list again? Is that a white supremacy thing, or…

36:17 Will Duffield: No, no, no. [chuckle]

36:19 Trevor Burrus: I’m sorry.

36:19 Will Duffield: This is a decentralized social network.

36:22 Aaron Powell: This is the Twitter for sex workers.

36:25 Trevor Burrus: Oh, okay. I thought you said it was called Mastodon.

36:27 Will Duffield: Mastodon is the name of the underlying technology.

36:28 Trevor Burrus: Ah, okay.

36:29 Will Duffield: So anyone can set up a Mastodon instance.

36:31 Trevor Burrus: Okay.

36:32 Will Duffield: It’s like installing WordPress on your own server, but then it can talk to other instances as well.

36:39 Trevor Burrus: But they’re more vulnerable, as you said.

36:41 Will Duffield: Than we might initially expect. There’s a thought that, “Well, it’s decentralized, so it’s censorship-proof.” Eh, you still need the DDoS protection. And I’m sure some of that will be worked out as we go forward. However, the other concern is that this regulation will cement the status of current market-dominant platforms. Facebook, YouTube, they’ve got a lot of legal clout. They’ve got huge war chests. They can afford to work under some of this proposed regulation. It’ll be more costly for them, but they can bear that cost. A new startup cannot. When you’re at the three-guys-in-a-garage stage, you can’t afford a legal team. And I’m concerned that, as a result of some of this upcoming regulation, the next Facebook may just be strangled in its crib.

37:44 Trevor Burrus: It would be like being stuck with MySpace back when Zuck was building Facebook in the garage. If MySpace could have had these regulations, we might all be still friends with Tom. Was that his name?

37:55 Will Duffield: MySpace Tom?

37:55 Trevor Burrus: Yeah, MySpace Tom, yeah.

38:00 Will Duffield: Yeah, it’s somewhat trite looking back to see proposals to nationalize MySpace, but people made that case then. It was so important that it needed to be nationalized, and had that happened, we would still have it.

38:14 Trevor Burrus: I have seen people say that about Facebook, too.

38:19 Aaron Powell: I mentioned earlier the disconnect between the way Congress responds to revelations of data gathering and mining by American intelligence agencies and the way it responds to Facebook doing it. A lot of people are rightly upset about the NSA gathering all of our data, but they’re upset about Facebook too, because they see these as basically the same thing: widespread surveillance, and widespread surveillance is bad. Is that the right way to look at it? Conceptually, does it make sense to think of what Facebook is doing as massive surveillance, and should we be worried about it in anything approaching the way that we worry about it when the NSA is doing it?

39:04 Will Duffield: It depends upon what your imagined threat is. If it’s this sort of social discipline, you might be even more concerned about Facebook. However, as far as we know, Facebook data is not used to, in Snowden’s words, “put warheads on foreheads.” The NSA’s data is. So there is a difference there. However, when it comes to how this Facebook data could be used down the road, well, legally, if the NSA wants it, they’re likely to be able to get access to it. So the mere fact that it has been collected and made legible can be concerning.

39:49 Aaron Powell: Why is this all about Facebook? Facebook gathers massive amounts of data about us in order to sell ads; that’s its business model. But the business model of Google is also to gather massive amounts of data about us in order to sell ads. Twitter’s business model too: they don’t gather quite as much data, but they gather a ton of data about us in order to sell ads. Is there something technically different about Facebook that makes it creepier, or is there something culturally different about Facebook that gets people more upset?

40:23 Will Duffield: I think it’s the latter. When you think about Google, it’s somewhat difficult to connect your use of search or YouTube to Google AdWords or their other advertising properties. When it comes to Facebook, it’s all occurring within the same walled garden. You’re reading your friends’ posts on Facebook, you’re also receiving advertisements from Facebook right alongside them. So it’s easier to think about it as a data harvesting and advertising entity within a social media space. Whereas when we look at Google, the way in which they collect information and then use it to sell ads is more opaque and distributed.

41:14 Trevor Burrus: Of course, for the same reason we discussed with MySpace, we can’t presume that Facebook will be around forever, and there are network effects in all these things, but if…

41:24 Will Duffield: But there’s also the intergenerational element.

41:27 Trevor Burrus: That’s true, yeah. Kids don’t use Facebook. It’s what old people do.

41:32 Will Duffield: It’s underappreciated that, for the most part, many of us online today all came online at the same time, regardless of our ages. However, as new generations of so-called digital natives [laughter] emerge…

41:48 Trevor Burrus: I’m laughing, ’cause there’s this… Going around the office, everyone was using the term, “digital natives.” Some of our colleagues were like, “Digital natives, what is that? Digital natives?” So, Will is a digital native. Are you a digital native?

42:00 Will Duffield: I guess so.

42:00 Aaron Powell: Okay. We’re not.

42:02 Will Duffield: On the border, at least.

42:03 Aaron Powell: Analogue native.

42:04 Will Duffield: But it’s true, most young people don’t want to hang out in the same spaces as their parents. You don’t want your parents to see what you’re up to with your friends. And I think you will see much more age cohort-based segregation between platforms going forward.

42:25 Aaron Powell: Why don’t the youth just learn how to use Facebook’s post privacy tools so then they can share their posts only with their friends and their parents can’t see it?

42:35 Will Duffield: And if you’re on there, your mom’s gonna wanna be friends with you.

42:39 Aaron Powell: Well, she can be friends with you, but you stick her in a group and then you say, “Share with just this group, which is my homies.”

42:46 Trevor Burrus: I won’t speak for 15-year-olds, but they seem to like more self-destructing things, like Snapchat, right? It goes up and then goes away. And they like things that sort of document their lives, so they can film them and send them out to people. That’s what… I don’t know. That’s what the kids are doing.

43:06 Aaron Powell: But they’re all shifting to messaging.

43:07 Trevor Burrus: That too. So this all brings up… Political communication in the next decades, I mean, we’re not gonna be able to predict what it is, but it will be the main way that people form their attitudes about almost everything. Even the news networks, their median age is, whatever, 60 years old or something like that. No 23-year-old is going to start watching the 7 o’clock news in 10 years. They’re gonna get it through all these other things. So this conversation about how political opinion is formed… And the interesting thing is that the more it targets, and going back to my theory of not having a good theory of the other side, the more you’re effectively targeted with just things you already agree with, the more the other side will look completely insane to you. And then this question… I do think that there’s something like a fairness doctrine that will be seriously discussed for social media in the next 20 years. I don’t know if you have any thoughts on that, Will?

44:06 Will Duffield: I really couldn’t hope to effectively comment on the 20-year horizon of the Fairness Doctrine. I will, however, point to what I think is one of the most interesting facets of this claim that Facebook has been censoring conservatives or other marginalized viewpoints: the fact that these two women, Diamond and Silk, political e-celebrities who rose to prominence by streaming themselves discussing Donald Trump, were enough of a political concern to the Republican establishment that today they were invited to the Hill as witnesses to discuss how they may have been impacted by certain changes to Facebook’s algorithm. That’s a deeply bizarre development, and it speaks to a digital culture and digital political life that is increasingly eating real-world politics.

45:16 Aaron Powell: That conservatives-thinking-they’re-being-censored-on-Facebook thing, I’ve wondered about that. So the claim is, “Our posts on our Patriot Nation, whatever, page aren’t getting anywhere near the reach that they used to, so we must be censored.” But all of this is happening at the same time that Facebook announced they were basically reducing the reach of all pages. And so this is one of my pet peeves in this space: people routinely don’t understand how the technology works, have no idea how the platforms work, stumble across something that affects everyone but that they’ve just noticed for the first time, and assume it was directed at them by some engineer. Maybe it’s been going on forever, but they just noticed it for the first time. And so then they blow up and it’s like this grand conspiracy, when in fact it’s like, “No, we changed our policies for everyone six months ago and you just weren’t paying attention.”

46:23 Trevor Burrus: You think conservatives are particularly bad at that?

46:26 Aaron Powell: I think that, by and large, given the demographics, conservatives, especially of the MAGA sort, are probably considerably less tech-savvy. They skew older, and older users tend to be less tech-savvy; they skew to other demographics that aren’t gonna be as tech-savvy either. So I think they’re probably not as media- or online-literate as other groups.

46:52 Trevor Burrus: They also might have more of a persecution complex.

46:55 Will Duffield: Particularly in relation to Bay Area liberals. When the Daily Kos sees that its traffic has tanked after this algorithmic shift, there isn’t really a perceived grievance against them on the part of those who run and operate these platforms. When it comes to conservatives, there’s already this cultural or tribal distance. So I think it’s just easier to impute that animus, whereas it wouldn’t occur to the operator of a liberal page.

47:36 Trevor Burrus: So, for people who use Facebook… We began this discussion talking about how, when we talk about Congress, they don’t seem to understand these things. If someone is gonna learn something from this “scandal,” putting that in scare quotes too, maybe things they didn’t know about Facebook… What should they realize is the lesson from this, in terms of Facebook and social media in general?

48:03 Will Duffield: The lesson that I frequently look to when these stories come out is that everything is permanent online. And there, if I can jump for a moment from the broader Facebook Cambridge Analytica political scandal to an incident in the past week involving some blog posts that Joy Reid may or may not have written about a decade ago. They’ve been dredged up from the Internet Archive, and the Library of Congress had some copies. She’s claimed that they were hacked or manipulated in some sense, but it seems as though she attempted to memory-hole this stuff. And as much as we heard in the hearings with Zuckerberg that, “Well, when you delete your page, it’s gone forever,” that may be the case within Facebook’s servers, but anyone you were friends with might have saved a copy of your page, or saved screenshots of things you’d posted. Everything online is permanent so long as someone out there, even the smallest actor in the world, wants to retain it. I think that’s still underappreciated and will remain underappreciated for a while.

49:35 Aaron Powell: I guess looking forward, we’re sitting in April right now as we record this, it may come out in May, and at the end of the year we have a congressional election, and then two years later we’re gonna have a presidential election. And if people are still re-litigating the 2016 campaign, they’re gonna get pretty hysterical about this stuff when it’s control of Congress or the presidency at stake. What do you think we, meaning both the American electorate in how we engage with Facebook and Twitter and other things online, and also policy makers, should do about these concerns before we get to November?

50:24 Will Duffield: I think we ought to keep a very close eye on how these platforms are used coming into the midterms, because none of the incentives to misuse them, in the case of Russia, or to play up that misuse, in the case of a host of domestic political actors, has changed between 2016 and this coming November. Russia, in particular, will have every incentive to meddle just enough, or even to claim that it has meddled just enough, that the American government will continue to attempt to hobble its own vital tech sector.

51:12 Will Duffield: Those who want to see more regulation of online advertising, and even speech, will again have every incentive to play up the impacts of bad speech. So policing these sorts of claims becomes a vitally important democratic duty, because even if it’s the cleanest election in history, you’ll see a lot of them.

51:40 Aaron Powell: Free Thoughts is produced by Tess Terrible. If you enjoyed today’s show, please rate and review us on iTunes, and if you’d like to learn more about libertarianism, find us on the web at www.libertarianism.org.