In a previous episode, we covered the transformation of China, which is adopting new tech like delivery drones and digital payments years in advance of the US. But being at the forefront of tech adoption has a darker side when it comes to modern surveillance techniques.
In this episode we parse what is hyperbole and what is reality when it comes to the Chinese surveillance state. That includes social credit systems which combine credit scores with social media behavior and personal payment history. Additionally, local governments are developing facial recognition software that can be used with the hundreds of millions of surveillance cameras littering the country to automatically identify wanted criminals or even to embarrass jaywalkers.
Finally, we discuss some of the groups being targeted for surveillance, including journalists and dissidents. But the worst excesses of the surveillance state have been focused on the Uyghur people of western China, many of whom have been subjected to religious persecution or even rounded up and sent to concentration camps for re‐education.
How much does the Chinese Government restrict the movement of their citizens? What is Sesame Credit? What is Alibaba? What facial recognition technology does China use? Do law‐abiding people change their behavior in response to surveillance revelations?
[music] [00:00:05] Paul Matzko: Welcome back to Building Tomorrow, a show dedicated to the ways that tech and innovation are making the world a freer, safer, and more prosperous place. As usual, I’m your host, Paul Matzko. With me today in the studio are Matthew Feeney and Will Duffield. Today we’re going to return to a subject that we dedicated an episode to several weeks ago, a place, not even really a subject, a place. A couple weeks ago, we talked about the transformation of China and all the ways in which new tech is being rolled out and adopted en masse in China in ways that might not happen in the US for another 5 or 10 years. China’s leapfrogging us technologically. That episode was the optimistic case, the ways in which these new techs are going to improve life and economic growth in China. But there is a darker side to that tale. This new technology can also serve the Chinese government, the party, the Communist Party, as it seeks to surveil and control its people. We’re going to focus on two of these new technologies today. One is what’s called the social credit system. The other is new applications of facial recognition technology. Let’s get started with the social credit score system. Will, what’s the party line? I mean that metaphorically as well as literally.
[00:01:24] Will Duffield: Well, I guess our media party line or our popular perception of social credit score systems in China is that they are holistic state systems designed to track the social impact of citizens’ behavior and either reward or punish them for it, and that this is done with a high degree of certainty or efficacy across large segments of the population. Now, when you drill down into it, when you look at what this state social credit score system is or aspires to be by 2020, the planned delivery or rollout date, and what you see on the ground with regard to these trial social credit systems that for the most part have been put into place privately by Chinese firms, there’s a lot of room between the two interpretations or expectations as to what this is.
[00:02:39] Paul Matzko: We do need to get to the, I guess it’s the Sesame Credit system, the biggest one. We do need to talk about that in a minute. Before we get there, and I think we’re going to complicate that story of one giant number that determines pretty much everything you can do as a Chinese citizen or consumer, we’re going to mess with that narrative and show how it’s not quite what it sounds like, and they’re definitely not there yet. But before we do, there have been some early signs of a social credit system in place actually working. I think, Matthew, you and I were talking about this earlier: what was it, something like 15 million Chinese people who, because of their low score on one of these systems, have been banned from buying plane tickets or riding trains? How’s that work?
[00:03:37] Matthew Feeney: I think it’s somewhere between 12 and 15 million, I’m not sure of the exact figure, but I think that speaks to what a lot of people in China like about the proposed system. China is a country that’s undergone an economic transformation in the last couple of decades, with tons of people appearing online, but there’s not a good centralized credit system, and a lot of people are not paying back loans even after courts have required them to do so. Systems like this, systems by which you can use very popular private companies like Alibaba, for example, to help build a credit system, are something where I can, I suppose, sympathize with why people might think that’s a good idea, right?
[00:04:24] Will Duffield: I hear you. Coming out of decades of communism, you don’t have a history of, I guess, a Protestant work or debt repayment ethic. It’s not a high-trust society.
[00:04:38] Matthew Feeney: Yes, trust is an essential part of any functioning economy. This is an attempt to help build that. The problem is these people who were put on the blacklist, which at the moment is very small because it’s still in the trial period. A lot of what we’ve seen are trial programs that utilize private company data that already exists.
[00:04:59] Will Duffield: It’s disaggregated, to some extent.
[00:05:00] Matthew Feeney: Yes, that’s right. The ultimate goal is a universal system by 2020. But if you are someone who is on this list, it can be really bad. It will make it difficult to buy train tickets, plane tickets. NPR had a podcast recently talking about how, at least in one Chinese city, being on this list prompts your ringtone to actually be some sort of siren, warning people next to you that you’re bad at repaying debts. More nefarious, I think, is what’s happened to some people who stand up to the government. There’s at least one journalist I’ve seen reported on who wrote something the local government didn’t like. They didn’t think that his apology was sufficiently sincere, and he can’t book train tickets. He can’t book travel, and he’s been put in this weird situation where he’s under effective house arrest, thanks to this system. That’s the really worrying application of this system.
[00:06:02] Will Duffield: When you’re talking about a country in which internal free movement already isn’t guaranteed, where there’s an internal passport system, how much of a new imposition is this? Say the private social credit trial system that this fellow is a part of, or subject to, has prevented him from booking plane tickets; but before it, the local government, being frustrated with him, could have prevented him from being allowed to move, period. Is this a new imposition or just stacking one form of movement restriction on top of another?
[00:06:42] Paul Matzko: To clarify, my understanding is this particular system that’s being used to keep millions of people from being able to purchase plane tickets and train tickets is based off of a government-run pilot system that is for people who have defaulted on debt. They’re debtors who have had court judgments against them. Now, there is an angle here that’s disconcerting, which is the way in which the party will use debt default court lawsuits to basically squeeze out entrepreneurs who have angered local party officials.
[00:07:19] Will Duffield: Really just restricting the movement of debtors seems problematic.
[00:07:24] Paul Matzko: Just to clarify, this is not like the, and we’ll talk about the private credit score systems in a bit. This is a government run system just focused on defaulters, debtors.
[00:07:35] Matthew Feeney: Yes, but I want to make sure that we mention what we just discussed, namely that the kind of public shaming aspect of this is not reserved to this sort of system. There are places in China where, if you attempt to jaywalk, your image will appear live and there’ll be loud condemnation of your behavior over loudspeakers. That’s a very creepy application.
[00:08:03] Will Duffield: Do they have it for bad drivers too? I mean as a pedestrian, I might [unintelligible 00:08:07] but not the jaywalking thing.
[00:08:10] Paul Matzko: One of the articles mentioned the particular jingle. It was for the debtors, but they do something similar for the jaywalkers in some cities. The little jingle was: you’ve defaulted on, I don’t know, your car payment, and you might one day be walking on the street and see your image projected on a billboard, and it will play on a loop a jingle that goes, “Come, come, look at these debtors. It’s a person who borrows money and doesn’t pay it back.” I don’t know what the tune is. I imagine it must be to, like, Maroon 5 or some hated band.
[00:08:46] Will Duffield: Do we have any evidence that this is effective? Do we see higher debt repayment rates for it in the places where they’ve rolled it out? Is there data?
[00:08:56] Paul Matzko: On the debt side, I don’t know. On the jaywalking side, there have been a number of articles noting that, at least according to city officials, at these targeted intersections where jaywalking was a problem, the rate has fallen, though they did have to train people to respond in the way they wanted. At first people thought, “Oh, I’m famous, I’m on a screen.” They had to be told, “No, no. You’re being shamed.”
[00:09:16] Will Duffield: That’s exactly how it would work here. We couldn’t do that. That’d be instant celebrity. They’d put their Twitch handle up or something with it.
[00:09:25] Paul Matzko: Yes, but you have this– it’s not to the point where you have a single score across multiple domains across the entire country. It’s all kind of pilot programs, targeting jaywalking in a particular city or debtors across the whole country. You can see the pieces kind of‐
[00:09:43] Will Duffield: [inaudible 00:09:43] to boil a frog.
[00:09:45] Paul Matzko: You could bring all these pieces together into one aggregated system. Perhaps we should explain that when we talk about social credit systems, we’re going to talk here about what’s called Sesame Credit, which is tied to Alibaba, which is basically the Chinese version of Amazon. It actually does more than Amazon in a lot of ways.
[00:10:11] Matthew Feeney: Yes. Certainly from a bird’s-eye view, calling it the Chinese Amazon does the job for the purpose of this podcast. I think it’s important for the audience to know that it does more than what we associate Amazon with. I also want to make sure that we point out that on this side of the Pacific Ocean, we do have something like social credit for certain things and services that we use all the time. Your rating on Uber is a version of this, when you rate your Uber driver. We rate things all the time: Yelp reviews and credit scores‐
[00:10:52] Paul Matzko: Normally there’s not a social aspect, but it certainly impacts our lives.
[00:10:58] Matthew Feeney: Sure. That’s certainly part of it, right? We’re used to things called credit scores, but credit scores will not affect whether you are allowed to buy a plane ticket to fly somewhere. If you have the money, you can buy the ticket, and that’s-
[00:11:14] Paul Matzko: It’s a merging– it’s like what we’re used to with FICO, which, if you look up the acronym, I forget it. There were three guys who put the system together back in the ’50s, and we call it FICO now. It combines the FICO part, which is all about lending, when you’ve borrowed money, have you repaid it? With customer appreciation programs that reward you for buying from certain companies, and with social media tracking. Like, are you a shitposter on social media? It combines these three kinds of data into one, so you’d be rewarded.
[00:11:51] Matthew Feeney: Well, it depends who’s setting the policy, right? A private company– If Amazon started doing this, there’d be very few, at least libertarian, objections if Amazon wanted to reward people and give them discounts or favors if they bought American products or if they bought patriotic products. A lot of red, white and blue‐
[00:12:08] Will Duffield: It’s why CVS gives you a four foot long receipt.
[00:12:11] Matthew Feeney: There’s no libertarian objection to that per se. The problem is that the Chinese government’s plan for this universal system for 2020 is a lot more intrusive and worrying than an Alibaba credit score, which some people might find creepy. Do you want a big company like that deciding to reward you if you buy good patriotic stuff versus foreign goods, or if you’re buying bad video games or whatever? That’s-
[00:12:43] Will Duffield: At least privately, that’s the softer touch you can get when it comes to‐
[00:12:48] Matthew Feeney: Sure. What the Chinese government has in mind is a little more worrying than that.
[00:12:55] Paul Matzko: It raises the specter of: okay, so you are a dissident, you’re not happy with the treatment of oppressed people groups, you’re complaining about the central government. The central government responds by lowering your social credit score or requiring Alibaba to do so. In which case now, yes, you can’t buy certain products. You’re paying more money. Your cost of living goes up, your ability to move throughout the country goes down. You might not be able to put your kids in private schools, like the debtor system means you have to keep your kids in public school systems. The ramifications for your daily experience, your daily life, get really bad really quickly.
[00:13:38] Will Duffield: It relies upon a private data-gathering capability, and I think that really throws into question the extent to which anyone can use data at large scale liberally, as it were, without illiberal consequences under a single-party state or regime. Even the act of private collection, knowing that were the state to want the data, it would have whatever access it wished, raises the question of whether that collection is morally defensible, or at least pragmatically something that ought or ought not be done.
[00:14:24] Paul Matzko: I should mention our producer, Tess, had just texted me and said, “Has anyone seen the Black Mirror episode?”
[00:14:32] Matthew Feeney: Yes, yes. Of course.
[00:14:34] Paul Matzko: I think, for our listeners who watched it, that’s the one where there’s a wedding and her rating goes down.
[00:14:42] Matthew Feeney: I hate that I know who– I forget her name, but Ron Howard’s daughter. She’s a very good actress in her own right, but I’m sorry that that’s how I know her. She plays the protagonist in this near future where daily interactions‐
[00:14:59] Will Duffield: Bryce Dallas Howard.
[00:15:00] Matthew Feeney: That’s right. Yes. Bryce Dallas Howard. She plays the main character. Throughout the day, interactions are rated. If we had a good podcast, I’d rate it a certain number of stars, but it’s not just that; your barista, all kinds of day-to-day interactions are rated, and you have a score. The nightmare scenario that plays out in this Black Mirror episode is that she’s been invited to a wedding, but a few things that happen to her, unfortunate realities of the day, mean that her score goes down and down, which makes it more difficult to rent a car, more difficult to get certain goods. It’s one of the great things about Black Mirror, that it’s this show taking a look at the near and conceivable future, right? It’s not far off what we’re discussing today, but we should keep in mind, of course, that this is the Chinese government that wants to plan all this. I have my own creepy concerns about a private system.
[00:16:02] Will Duffield: Even in that Black Mirror episode, there was no coercion as you would see within a state-backed social credit system. I was frankly surprised that more characters didn’t just walk away and live in a [unintelligible 00:16:16] hut.
[00:16:19] Paul Matzko: [laughs] This is probably more about you, Will.
[00:16:21] Will Duffield: I don’t mind.
[00:16:22] Matthew Feeney: Most people like to be part of society. [laughter]
[00:16:25] Will Duffield: Society where everyone is rating you all the time? That’s terrible.
[00:16:29] Matthew Feeney: I’d have to be pushed pretty far to live in a hut in the woods. Fine for a weekend, maybe.
[00:16:35] Paul Matzko: This is the weird thing that‐
[00:16:38] Will Duffield: [unintelligible 00:16:38] just get away from being rated all the time because it’s intrusive and dehumanizing.
[00:16:45] Paul Matzko: I think when people who’ve seen this episode hear “social credit systems,” that’s what they’re thinking. The thing to note is that the government is not actually close to rolling out anything like this. The current experience of most Chinese consumers with Alibaba’s Sesame Credit has actually been overwhelmingly popular and positive. All the polls, that poll you sent to me, Will, it’s-
[00:17:14] Will Duffield: Yes, it’s [unintelligible 00:17:14]
[00:17:15] Paul Matzko: Part of that is something I think you teased, Matthew, which is that we shouldn’t underestimate the benefits of banking the unbanked. If you want to get ahead, if you want to join the global middle class, it is vital that you have access to credit, and because China is a country that only in the last generation or two has emerged from an essentially premodern agricultural economy, there are people who don’t have an easy avenue to get onto a bank credit system. There’s a cart-before-the-horse problem here: you don’t have a lending history, and you can’t get one because you don’t have a lending history. A social credit system, which counts transactions, not just loans, gives hundreds of millions of Chinese consumers potential access to credit, a ride to the global middle class.
[00:18:11] Matthew Feeney: Right. I think what people might be creeped out about is the fact that someone could spend $10, or the equivalent of $10, on diapers or the same amount on hard liquor, and it would have a different impact on how you’re viewed by the private company, or potentially even the state, whereas you might repay that money regardless of what you spend it on and be equally good to lenders, right? It’s the nudging that I think freaks people out and gives this a dystopian atmosphere.
[00:18:51] Paul Matzko: Is there much informal cheating of this system, or opportunities for arbitrage? I bring my stuff up to the front counter, I give the cashier a dollar, and he scans diapers instead of the beer, even though I’m buying beer, so that it shows up as though I’m responsible and I get discounts and good rewards, when in fact I’m getting my beer and being naughty.
[00:19:15] Matthew Feeney: I don’t know, it wouldn’t surprise me, but I haven’t seen data.
[00:19:18] Will Duffield: I would expect, just to keep an eye on it, the greater the rewards, the more incentive you have for those sorts of schemes. The same way as, yes, if your smart fridge starts selling data to your insurer as to what you put in it.
[00:19:34] Paul Matzko: You would have a fridge that’s connected to the network, showing that you buy all the good foods, which rewards you in your social credit score. Your fridge is monitoring what you bought and refilling it.
[00:19:45] Will Duffield: We’re doing that here with health insurance.
[00:19:48] Paul Matzko: Right. Exactly. You have your official fridge that tracks, and then you have a black market fridge that doesn’t track, where you put your liquor. You get the idea. There are people who are always going to find ways of gaming a system like that. Well, why don’t we move on to the second part of our new surveillance tech in China, and that’s facial recognition. We mentioned this with the jaywalking ordinances. What that uses is cameras. There are over 200 million surveillance cameras in China, and there will be 300 million by, I think, 2020. Every major intersection, street corners; I mean, in urban centers, it’s hard to find a place in public that isn’t covered by a surveillance camera at this point.
[00:20:38] Will Duffield: Unlike surveillance cameras in other parts of the world where you see high density, they’re networked, which is a big change. You hear that everywhere in central London is covered by CCTV as well, but that’s a bunch of private CCTV cameras that no one’s really querying or saving.
[00:21:00] Paul Matzko: The idea is, as you jaywalk across the street, the camera’s catching your image, it scans your face against a database, and it can generate who you are just from your face. At least that’s the promise of this. Then you do all that shaming-type stuff we were talking about. Actually, they’re also rolling out car tracking technology using cameras, not unlike the drone episode that we talked about in a previous week, where in Baltimore they’ve been trying out drone tracking from the skies so that the police can follow any car in the city as it drives around and thus track criminals, bank robbers, et cetera.
[00:21:39] Matthew Feeney: Not drones yet. One day maybe [crosstalk] In Baltimore, they were using Cessna airplanes. I’m not aware of a‐
[00:21:49] Will Duffield: They had a blimp and they lost it and it crashed into the [unintelligible 00:21:52].
[00:21:53] Paul Matzko: Yes, but what we were talking about, the persistent surveillance systems [crosstalk]
[00:21:56] Will Duffield: Yes, just anything that you can take photos and then run back through the photos‐
[00:22:01] Paul Matzko: But they’re developing a similar system using closed-circuit TV cameras, which would follow people, or cars, at an individual level. The idea here: there have been a number of stories about individual criminals caught using this closed-circuit facial recognition system. There was a potato thief who stole $17,000 worth of potatoes, as one does. I imagine, Will, on the weekends, you’re not in your Kaczynski cabin out there stealing potatoes. He got caught going through a security checkpoint at a pop concert. The facial recognition software said this is a wanted man who was traveling under fake documents. These stories are being highlighted by the government. There is something Orwellian about the idea of even police using smart glasses with cameras in them, with facial recognition technology, just scanning crowds looking for criminals, but there’s a bit of a mismatch between the official story of what this looks like and how it works and what is actually going on behind the scenes. When you look at the jaywalking stories and dig down a little bit, it turns out the system’s not really automated. They are taking videos of everyone as they jaywalk, but then someone has to actually go and feed– the system can’t handle more than a few thousand faces being checked against it. They’re actually doing it by hand on the back end. The camera’s capturing people’s faces, but then they’re by hand putting in batches of a few thousand people to scan against. It’s not like they’re catching them in the moment, instantly identifying them, putting them up, and shaming them right then and there. It can often be weeks or months, and it’s not very efficient if you’re having to do it by hand. It kind of defeats the purpose.
[00:23:56] Will Duffield: I assume like most facial recognition systems we see in the world at the moment, you get a lot of false positives.
[00:24:05] Matthew Feeney: That’s an interesting point, because false positives and false negatives are an important part of the efficiency, right, in any facial recognition system.
[00:24:11] Will Duffield: Here you’ve got police wearing glasses highlighting certain people’s faces. Unlike a false positive somewhere else, here it instigates, or might instigate, an immediate foot chase.
[00:24:27] Matthew Feeney: Yes. I want to make sure that we’re clear on that. A lot of the jaywalking shaming, from what I understand, works even if it doesn’t identify you immediately; there will be some sort of detection that someone is jaywalking, and it will have a live feed of wherever that intersection is. They might not identify you, but it’s still enough that it would deter some people. Maybe this will have to be checked, but I don’t think it’s the case that every camera with facial recognition capability in China is being outfitted with the same database. It’s not that hard to think, okay, well, in the center of this town, here are the 10,000 people most likely to be here at this time. You don’t need all 1.4 billion people in China, all of that data. I think it is being used in some parts of China much more ruthlessly than in others. There’s a big difference between using it to deter jaywalking and using it to track people’s religious practices and other things like that.
[00:25:30] Paul Matzko: You could imagine, let’s say in a particular region, the local party chapter wants– There are 100 people who they’re particularly concerned about right now who have been participating in anti‐government activity. So they say, “Okay, we want to catch these people doing something wrong that we can shame them for. We’re going to have our database of a few hundred or a few thousand people. We’re going to make that accessible to these jaywalking cams, to other kinds of monitoring devices, so that they can be flagged instantly and we can then bring them in for– That’s our legal fiction for imprisoning or harassing them.”
[00:26:10] Matthew Feeney: I’ve said in print before that my objection to facial recognition is this pervasive, real-time use on law-abiding citizens. If you have a facial recognition system that is used as an investigative tool and it is only populated with data related to people with outstanding warrants for violent crimes, then my objections reduce in number. It’s not clear to me that– Nothing like that is what we’re seeing in China, frankly. I don’t want to scare listeners into thinking that China is the future of American surveillance. We have very different political and judicial systems. It’s frightening nonetheless to see your fellow men treated in this creepy way.
[00:26:58] Paul Matzko: There’s an interesting aspect of this. Unlike the social credit system, where the official party line is that there is this robust single-score system that’s going to be developed within two years, the reality, of course, is that that’s hype. The reality is that there are bits and pieces of a system that are disconcerting in their own right, but we’re not anywhere close to a full single-score social credit system. The same thing is true for facial recognition: both the central government and local Party chapters and municipal governments have an interest in making people think that they might be being watched, surveilled, and recognized by facial recognition at any given moment, even if they’re not. It plays to their benefit to hype up, to exaggerate, how far this technology has been developed and how much it’s actually being implemented.
[00:27:55] Matthew Feeney: This isn’t a new kind of observation. I think people who pay attention to surveillance have said for a long time that some of the scariest implications of widespread surveillance are the impact it has on law-abiding people. In the wake of the Snowden revelations there were very interesting studies done on Google searches and Wikipedia searches. Pew had an interesting survey on people’s behavior, and it turns out that you don’t have to be an Islamic fundamentalist to be a little creeped out by what Snowden revealed. You might be a little less likely to Google certain things: medical conditions, fetishes, religion, anything related to– actually quite popular hobbies that have to do with gun making or anything to do with firearms. It’s not a surprise that law-abiding people change their behavior in the wake of surveillance, either surveillance revelations, or not even revelations, just new policies like more cameras on the street. That’s the really worrying thing.
[00:28:57] Will Duffield: It’s not just people, as well, but private firms that provide valuable services. Who PayPal will process payments for is to some extent contingent upon not just state regulation, but signals sent by the state as to what is approved and what is not. You can look to something like Operation Choke Point there, or more recent crackdowns on ASMR performers post-FOSTA.
[00:29:24] Paul Matzko: There’s a sense in which what we’re describing here, and we talked about this in an earlier episode with Aaron, is the idea from Bentham, the 19th-century philosopher: his idea of the panopticon, where you have a prison and in the middle there is a watcher who can look into the cells of the prison. It doesn’t really matter if the watcher can actually see into all the cells at once; what matters is the possibility of surveillance. He can’t literally watch all the cells himself, and just like that, the Chinese government can’t actually see and process and prosecute based on the information it’s receiving from 300 million CCTVs all at once.
[00:30:07] Will Duffield: But if you can’t determine when you’re being surveilled and when you aren’t, you might as well be being surveilled all the time, as far as you know.
[00:30:16] Paul Matzko: Is that a camera on the corner of the street? Is it watching? Is it not? I don’t know.
[00:30:22] Will Duffield: It’s safer to assume it is.
[00:30:24] Paul Matzko: Right. It’s safer to assume it is. There’s actually, and this is just a funny tidbit, it turns out the Chinese government calls its facial image database Skynet, and I don’t know if someone in China is a big Terminator fan or if they just didn’t get the reference.
[00:30:42] Will Duffield: If you know anyone named Sarah Connor, maybe they shouldn’t visit anytime soon.
[00:30:47] Paul Matzko: Maybe not go over. There is one little tech tip that I wanted to throw in here, and it isn’t immediately applicable to live camera facial recognition software, though you could see how it might be down the road. At TechCrunch Disrupt, there was an anti-facial-recognition startup, it’s [unintelligible 00:31:06] called D-ID. The [unintelligible 00:31:11], the idea is you’re de-identifying, dis-identifying yourself. What they do, it’s really about social media images. You share an image on Facebook, and what you don’t want is some company reading the image, recognizing it’s you, and then, based on the activity you’re engaged in or based on information from that, knowing more about you than you want them to know. They now know you went to the beach with your family on this date, et cetera. There are lots of reasons people don’t want those images online giving out personal information about themselves.
[00:31:53] Matthew Feeney: Well, it’s a bit of a catch-22 when you think about these anti-surveillance methods and techniques. You can use something like what Paul’s just described. There’s a whole niche fashion industry [crosstalk] There are also glasses. There are garments that cloak you from thermal scanners. When you’re talking about online, you can use Tor, you can use Signal. The problem is, unfortunately, some people are going to think you look suspicious in virtue of doing this.
[00:32:30] Paul Matzko: Yes. You’re wearing dazzle paint. You look like an ’80s glam rock star.
[00:32:35] Matthew Feeney: That’s the worrying thing. I’m not against people using methods like this of course. I think it’s a regrettable fact that states will keep an eye on people who take evasive action when it comes to surveillance.
[00:32:47] Paul Matzko: Well, I think what we should expect, like we were talking about, is for this to escalate. There will be a facial recognition, image recognition arms race, essentially.
[00:32:58] Matthew Feeney: When you think about that, though, that’s easy for us to say. In the United States, for the moment and I think for the foreseeable future, even if CCTV is outfitted with facial recognition technology, you’re not going to get pulled over by the cops if they notice that you’re wearing a big hat or if you’re wearing– But the problem is that if‐
[00:33:21] Paul Matzko: There are anti-mask laws in many states [crosstalk]
[00:33:29] Matthew Feeney: The problem is that in parts of China where this has really been utilized a lot, that’s just not an option. It’s not going to be good enough, when a Chinese police officer comes up to you, to say, “Well, I object to this sort of thing, so I’m wearing a big hat and I don’t need you.” Telling someone who lives in that community that that is an option is laughable, which is a shame.
[00:33:53] Paul Matzko: This is a good moment to turn. We’ve been talking about two new buckets of technology and how they can be used for surveillance and social control in China. Let’s talk a little bit about the people who are going to be affected by this in some of the worst ways, for whom‐
[00:34:09] Will Duffield: Who are being affected by it. It’s not will be. It’s happening now as you listen to this podcast.
[00:34:16] Paul Matzko: Why don’t we actually start with, you mentioned Uyghur activists, Uyghur dissidents. Flesh that out. Who are the Uyghurs? Why is this surveillance technology, this social control technology, particularly disconcerting to the Uyghur community?
[00:34:30] Matthew Feeney: Yes. The Uyghurs are a Muslim minority who live in western China, in an area that they refer to as East Turkestan, while the Chinese refer to it as, I’m going to butcher the pronunciation, Xinjiang province, which translates basically to “new frontier.” It became a part of what we know as the People’s Republic in the last century. The Uyghurs are not ethnically Han Chinese. They’re a distinct cultural group. They are Turkic. They have a distinct cuisine, a distinct language, a distinct culture, and of course they’re Muslim. I think it’s fair to say that they are the target of the most intensive and highly sophisticated surveillance regime in the world.
[00:35:16] Paul Matzko: The Chinese state, well, it’s not popularly recognized, and they don’t necessarily sell themselves this way, but it’s an ethnostate. It is a Han Chinese state. To be Chinese is to be Han Chinese.
[00:35:31] Matthew Feeney: The Chinese government will make the argument that this intensive surveillance is necessary because, in the last couple of years, Uyghur separatists have committed atrocities. In 2014, some Uyghur attacks killed a couple dozen people. In 2009, there was a lot of unrest in the region. A lot of that, of course, is worrying, but it does not justify the extent of the surveillance we see. I think it’s important to outline exactly what we’re talking about here. We’re talking about iris scanners, Wi-Fi sniffers, mandatory ID, the scanning of phones at checkpoints.
[00:36:14] Will Duffield: There are QR codes outside people’s homes that the police can scan to determine who’s supposed to be in the home. It’s like a list of residents in QR form.
[00:36:27] Matthew Feeney: Yes. We also have shopping bags being x‐rayed. It’s horrible. At least one Uyghur who made it to the United States has described, when he and his wife were detained, having DNA taken, mandatory voice samples, and, of course, mandatory facial scans. The impact this has had on the community is intense. The UN estimates that around 1 million of these people are in what the Chinese government calls vocational training centers, but they’re concentration camps. I’m not using that term lightly. These are mandatory propaganda, brainwashing centers, effectively. Who is deemed eligible for this kind of reeducation unfortunately comes down in large part to religiosity. The more devout a Muslim you are, the more you profess your religion, the more likely you are to end up in one of these places.
[00:37:33] Will Duffield: It needn’t be political Islam in any sense. It can simply be trying to live a halal life, trying to follow the edicts of your religion as they apply to you, and no one else. Not trying to push them on anyone else, just trying to be a good person as you understand it, and that’ll land you in these reeducation centers.
[00:37:57] Matthew Feeney: I think it’s fair to say that this is the worst surveillance system we have on the planet. It’s what the North Koreans would use if they could afford it and had the technology. The Chinese certainly have the resources. The governor of the province used to run Tibet, which tells you all you need to know.
[00:38:18] Paul Matzko: Just cycle them around like the British Empire used to.
[00:38:22] Matthew Feeney: Yes. These parts of China have, I suppose, historically been difficult to clamp down on, and many different factors contribute to that. The ongoing worries that people around the world have about Islamic terror will, of course, provide constant excuses to target the Uyghurs. The Chinese have this horrible cocktail of not just anti-religious sentiment but also the ability to conduct surveillance. As I said earlier in the podcast, I don’t want to make it sound as if what the Uyghurs are being subjected to is what we should be ready for in the United States. We have a totally different political and judicial system. But it is an example of how this technology can be used when there aren’t checks in place, and it’s very, very worrying.
[00:39:19] Will Duffield: You mentioned earlier as well the role that Chinese foreign investment plays in the somewhat muted response we’ve gotten to this from the Muslim world, particularly its wealthier areas.
[00:39:34] Matthew Feeney: Sure. We have a colleague, Mustapha, who works here at Cato, who’s Turkish. I was speaking to him about this a while ago, and he said it was really notable that the Turkish president and the Saudis and the Egyptians are pretty quiet about this crisis, in large part because they don’t want to piss off the Chinese. It’s easily the worst persecution of Muslims happening at the moment, and usually a lot of Muslim communities will be outspoken when they see persecution from outside, whether it’s in Chechnya or the former Yugoslavia, but‐
[00:40:16] Will Duffield: Everyone has criticized the treatment of Muslims in Europe.
[00:40:20] Matthew Feeney: Yes, but the Chinese seem to be immune from that degree of criticism, which is a shame, because the Chinese government deserves to be on the receiving end of severe criticism.
[00:40:33] Paul Matzko: Then we have ordinary consumers, we have journalists. Will, this is something you’ve done a little bit of research on.
[00:40:48] Will Duffield: We’ve seen a broader media crackdown this summer and fall, particularly targeting foreign media or sources of publishing capacity not controlled by the state. Twitch has been banned. Beyond the suppression of the Uyghurs, we’ve seen a broader crackdown on religion and religious media and expression in China this summer and fall. While you certainly see the harshest conduct reserved for the Uyghur minority, we’ve seen a crackdown on unlicensed, that is, ungoverned-by-the-Chinese-state, churches: bulldozing churches, forcing churches to put pictures of Xi up on the walls, and now a bill prohibiting the live streaming of sermons or religious gatherings. You often see prohibitions on church or religious attendance by those under 18. It really cuts religion off at the knees, as it were: if you can’t inculcate or introduce your children to your religion, it’s pretty hard for it to last beyond your lifetime. We can understand both the Uyghur crackdown and the broader media and anti-religious crackdown as part of a whole that seeks to cut out potential alternative sources of authority, particularly political authority, outside of the Communist Party and the Chinese state.
[00:42:32] Paul Matzko: There’s this vast network of Chinese house churches who will spread, say, a recording of a sermon. There’ll be a pretty good preacher, but they can’t be in an American-style megachurch with seating for 20,000 people; they’re in cells, essentially, a cell structure of house churches, and they’ll spread popular sermons among the different houses. That kind of thing is now being criminalized. It’s disconcerting when you consider that there are far more Christians in China than there are in the United States. Actually, larger than the entire population of the United States. It’s a very large community of people who are starting to see this government crackdown as well. Then everyone’s eyeing the fact that you have a digitally connected consumer base of somewhere between three-quarters of a billion and a billion people who have smartphones and use services like Alibaba for almost all of their purchases and much of their daily life. People are ready to try to cash in and capture part of that market, and they put pressure on their governments not to speak out against these kinds of abuses, because at the end of the day, a few million Uyghurs in western China aren’t worth the billions of dollars of potential foregone profit from the Chinese marketplace. There’s a calculation going on there for a lot of western companies, not unlike Google, whose Project Dragonfly plans were leaked in the last couple of months, where they’re planning on going back.
[00:44:13] Will Duffield: I don’t see how they can go through with it [crosstalk] the DOD contract.
[00:44:18] Matthew Feeney: We should explain what it is.
[00:44:20] Paul Matzko: Let’s explain Project Dragonfly. About 10 years ago, from 2006 to 2010, Google cooperated with the Chinese government to censor its search results. If you put in Tiananmen Square, nothing would come up, or it would redirect you away from stuff the Chinese government decided was a risk to the regime. Google went against their first principles, “Don’t be evil,” their corporate slogan; they violated that to cooperate with an authoritarian government. But by 2010 the pressure to stop doing so got so strong that they pulled out. For the last eight years, they haven’t; there is no Google search function in China. New leadership at Google has decided, look, again, this is a large, very potentially profitable market we want a piece of. So, the documents were leaked. They’ve been secretly considering rolling out this Project Dragonfly, a deeply censored version of Google search in which, essentially, the party could say, “We don’t want these search terms to pop up. Or if they do pop up‐”
[00:45:28] Will Duffield: and tie‐
[00:45:29] Paul Matzko: –searches to individual phone numbers and addresses, so we can track the [crosstalk]
[00:45:34] Will Duffield: Can be policed.
[00:45:36] Paul Matzko: So, again, we have a system in which government voices and major companies around the globe aren’t willing to speak out against human rights abuses, against communities like the Uyghurs, just because it would cost too much. We don’t value their freedoms and their liberties as much as we value making money within the Chinese marketplace. That’s the situation we’re in. I think that’s where we’ll leave off this week, and until next week, be well.
[00:46:09] Announcement: Building Tomorrow is produced by Tess Terrible. If you enjoy our show, please rate, review, and subscribe to us on iTunes or wherever you get your podcasts. To learn about Building Tomorrow, or to discover other great podcasts, visit us on the web at libertarianism.org.