E27 -

We don’t know the consequences of the infinite scroll yet, but in order to actually live, it may be wise to be mindful of your screen time.

Paul Matzko
Tech & Innovation Editor
Aaron Ross Powell
Director and Editor

Aaron Ross Powell is Director and Editor of Libertarianism.org, a project of the Cato Institute. Libertarianism.org presents introductory material as well as new scholarship related to libertarian philosophy, theory, and history. He is also co-host of Libertarianism.org's popular podcast, Free Thoughts. His writing has appeared in Liberty and The Cato Journal. He earned a JD from the University of Denver.

Will Duffield is a research assistant at Cato’s First Amendment Project.

If you have kids, you know how alluring smartphones and social media can be for a generation raised with (and, at times, seemingly by) the technology. But researchers are starting to worry that engagement with this technology is rewiring people's brains by design. Tech companies have invented mechanisms like the "infinite scroll" and notification systems that trigger chemical releases in the brain, habituating users.

But as concern about these effects grows, companies have responded with new tools for consumer self-regulation. Paul, Aaron, and Will talk about whether government ought to play a role in regulating these technologies or whether private action and education are sufficient. They also discuss "mindfulness" apps meant to ease meditation and promote healthier living.

How old were you when you got your first smartphone? How much screen time should toddlers and adolescents be allowed to have? What are feedback loops and how do they influence technology product designers? What is the role of engagement for apps that are ad-driven? Can an app lead to spiritual enlightenment?

Further Reading:



00:05 Paul Matzko: Welcome to "Building Tomorrow," a show about how tech and innovation can make us all merry and bright. I hope you enjoy your Christmas, Hanukkah, Kwanzaa, or just plain old day off work with friends. But now, New Year's is on the horizon, that time of sincere resolve to change your life for the better next year, and then never using that new gym membership ever again. In that spirit, we're going to discuss the ways that digital tech has changed our routines and, even more significantly, our basic brain chemistry, in all kinds of unintended and often harmful ways. The too-long-didn't-read version: your brain's been hacked, and we should consider what to do about that. As usual, I'm your host, Paul Matzko. With me are Will Duffield and Aaron Powell.

00:55 Paul Matzko: To start, let's lay down a baseline. Okay, we're talking about smartphone use, to keep it real simple. How old were you guys when you got your first smartphone? And to define that, let's say a phone that you could, and did, use to go online, not just for calls and texts.

01:14 Will Duffield: Mid-high school, I got a hand-me-down Blackberry from my uncle, which I guess was the first internet phone I had. I had Twitter before it, actually, on a dumb phone, and it was great 'cause it was one of the few ways you could get live news updates through SMS, 'cause you could follow like CNN Breaking News and set them…

01:34 Paul Matzko: They’d send you the…

01:35 Will Duffield: Yeah. But it was certainly when internet phones were pretty new because there weren’t any limits on tethering at that point. And at my high school, you couldn’t use the school network for all sorts of gaming, movies, whatever else. And I was pulling like seven, eight gigs a month through this tether connection on this Blackberry.

02:00 Paul Matzko: That was a lot back then.

02:01 Will Duffield: So, I really used it. My first smartphone was less a smartphone and more of a wireless modem.

02:07 Aaron Ross Powell: I will now distinguish myself in age from Will by answering, I think I was either 30 or 31.

02:14 Paul Matzko: Oh, wow, okay.

02:15 Aaron Ross Powell: It was shortly after finishing law school and moving out to DC to work at Cato. I got… It was the T-Mobile G2. It was one of the early Android phones and it was terrible, but that was my first. So, I went through all of my childhood, and all of my college years, and all of my law school years without a smartphone. I did have an iPod Touch in law school, which I guess is kind of like a smartphone that only works when you're in the building.

02:47 Paul Matzko: They had WiFi though, so you could… Yeah, yeah.

02:50 Will Duffield: You had WiFi and you could use the internet, but it was very interesting with… That first generation of smartphones, it wasn’t an app ecosystem yet. It was basically a miniature… It was a web browser that you could carry around. You could check your email, but beyond that, I don’t really remember what functionality I was making use of.

03:13 Paul Matzko: Well, and I think for, at least for me, college is when I had my first smartphone. But the stuff that we associate with smartphone use among, well, especially teenagers now, is stuff that you did on a computer, right? So, like AOL Instant Messenger. That was our version of texting.

03:33 Aaron Ross Powell: ICQ was what I used in high school.

03:36 Will Duffield: Really? And enough of your high school compadres were also on ICQ?

03:44 Aaron Ross Powell: My nerdy friends, yes. All of us who fancied ourselves computer hacker sorts were on ICQ, and there was competition about it. 'Cause you didn't have a username in ICQ, you got a number, and it was a sequential number, so it was like, this is the number of users who had signed up before you. You were super cool if you had a low ICQ number. No idea what mine was.

04:09 Paul Matzko: So, I think what's striking about this is that… Some of the stuff we're gonna be talking about here, we're doing a giant experiment on an entire generation of Americans, really Gen Z, I suppose, who are…

04:24 Aaron Ross Powell: The founders is what they ask to be called. MTV recently did a survey of what they'd like their generation to be named, and the winning entry was the founders.

04:33 Will Duffield: Founders of what? What did they found? [chuckle]

04:35 Aaron Ross Powell: It’s not clear.

04:36 Will Duffield: I don’t see.

04:37 Aaron Ross Powell: Yeah.

04:38 Paul Matzko: I don’t know, that has a little bit too much of, like new founding fathers vibe to me. Maybe I’ve just watched too many of those Purge movies.

04:49 Aaron Ross Powell: Yeah, I didn’t have a strong opinion of this generation.

04:51 Paul Matzko: That rubs me the wrong way a little bit.

04:52 Aaron Ross Powell: I didn’t have a strong opinion of this generation until I saw that, and I was like, “Maybe they could be worse than the millennials.”


04:58 Paul Matzko: Hey, hey, give us our avocado toast, give us our… We're doing all right there, you bitter Gen X representative here.

05:05 Will Duffield: Yeah, Gen Xers don’t have any… No viable presidential candidates really. They’re getting beat by a bunch of septuagenarians.

05:12 Aaron Ross Powell: We invented the stuff you guys think is awesome.

05:13 Will Duffield: You’re the missing generation, frankly.

05:15 Paul Matzko: There's a version of the future, we could even say it's likely, in which we just skip from the last baby boomer president, Donald Trump, or maybe Joe Biden or Bernie Sanders next cycle [chuckle], straight to a millennial.

05:29 Will Duffield: Well, we're gonna get Alexandria Ocasio-Cortez. It's only men that have to wait until 35.


05:36 Aaron Ross Powell: We should get back on topic because I can see Tess, our producer, in there is kind of…


05:40 Aaron Ross Powell: Really distraught.

05:42 Paul Matzko: Yeah, she's a little aggressive. Okay. So we're back on topic, but we've established that we are older than the folks who grew up using this technology. For most of us, it was our teens, college, or early 30s. Really the concern here is that there's an entire generation who are, from toddlerhood even, already using smartphones and tablets, and we're starting to question what the effects are on just kind of basic human psychology from using these devices at that young an age. Some of the stuff…

06:20 Will Duffield: And I think particularly how they’re used at that age.

06:24 Paul Matzko: What do you mean by that, Will?

06:25 Will Duffield: Well, it’s one thing for a kid to watch TV shows on a tablet, it’s another for the tablet to be treated as a pacifier by the kid’s parents.

06:35 Aaron Ross Powell: It works wonderfully.

06:37 Paul Matzko: It does, I’m down with the pacifier.

06:38 Will Duffield: I’m sure it does, but the context of that use feels unhealthy. You’re supplanting some means of care with that Netflix tablet and…

06:53 Aaron Ross Powell: Sometimes when you’re really exhausted that’s the only thing left.

06:55 Will Duffield: I haven’t been a parent yet, it’s gonna be a trip.

06:58 Paul Matzko: Well, I'll share an anecdote. My partner was out on a walk and saw a young mom, clearly exhausted, pushing a pram with like a one- or one-and-a-half-year-old sitting in there, holding her mom's phone, watching, I don't know, Blue's Clues or something. And the question was, who is self-soothing here? And the answer is they probably both are. You're a tired parent, you just give them the device, and they're quiet and content for an hour, and you need that.

07:33 Paul Matzko: But there's all kinds of… When I call this a grand experiment, it's 'cause we don't know, during a developmental stage, what this will do to these kids' patterns of thinking, patterns of even socialization. It's one thing when you adopt this once your neural pathways are already relatively formed. Like, Aaron's pretty much set in concrete by the time he gets a phone; he's 31, his brain is functioning the way it's gonna function. And even in your 20s…

08:00 Will Duffield: No changing him now.

08:01 Paul Matzko: No, you can't teach an old dog new smartphone tricks, I think is the saying. But the reason why I think folks are concerned is because when you use these devices, it's an active device, so it's not just like watching a screen. And that activity has been designed by developers to encourage kind of basic, primitive chemical releases in the brain.

08:27 Paul Matzko: So, when you do certain things on your phone, you get rewarded with a dopamine rush. Your body gives you a little exhilaration, like, "Hey, good job." And a lot of the software is programmed to provide that. They're actually doing this consciously these days: they have neurologists, they have folks with training in brain chemistry working with designers on, "How can we get folks essentially addicted?" Maybe they wouldn't call it addiction, but get them hooked on using the software and continuing to use it.

09:02 Aaron Ross Powell: What strikes me as interesting about that in this context is, it's always been the case that product designers want to design products that are engaging. You want this to be a product that people are going to use a lot, because then they're going to recommend it to friends, or they're gonna want to buy the next version of it, or whatever else. But there was always a distance between you as the product designer and the use of the product. So A, you couldn't really measure. If Wired Magazine prints a print edition of their magazine and mails it to people, they can measure: are people re-subscribing? Are they cancelling their subscriptions? Are subscription rates growing? But they can't really see how people are using the magazine. Maybe they can get a sense that sales for advertisers went up by X amount when this issue went out, but there's this…

09:53 Paul Matzko: It’s pretty crude.

09:54 Aaron Ross Powell: It’s pretty crude. Whereas now you have… They can instantly… They can be measuring you in effectively real time as you’re using the app and it’s pinging back to the server.

10:05 Will Duffield: And it's not just measuring. Say you were to buy an egg beater. After buying it, the egg beater company wasn't making any more or less money whether you used it regularly or not. But when it comes to Facebook or really anything else with an ad-driven model, every minute you're spending there is revenue for the company.

10:28 Aaron Ross Powell: Right. But even if it's a one-off sale of the app, they can still be measuring this stuff, and then they can be tweaking this stuff. And the other difference, too, is when you buy that egg beater, even if they later say, "Oh, there are better ways to make this egg beater," and release egg beater version 1.5, you're not going to get that unless you decide to go out and buy another egg beater, and for a lot of products, we just don't buy new ones until the old one breaks.

10:58 Aaron Ross Powell: But they can now push the updates out to you, either in real time if it's websites, or close to it if it's over-the-air updates. And so you can create these feedback loops. But one of the things that gets lost then… I think the result of that is this: a product designer has a lot of values they're trying to bake into a product. There are considerations like, you want a product that works well, but also a product that has certain aesthetic qualities, a product that has a certain feel to use, a product you could be proud of; these are all values that exist. But the feedback loop of testing plus instant optimization, I think in a lot of cases, crowds out all other considerations in designing the product, except engagement.

11:48 Paul Matzko: There's an example, I think from Tristan Harris, who was an Apple engineer and a Google manager; actually, Google's internal design ethicist was his title. Now he's left to basically complain about how terrible all these things he helped create are. But he, I'm pretty sure it was him, gave the illustration of how on Instagram, they'll track their users, and they've noticed that certain users respond favorably, they engage more, when instead of releasing notifications of likes whenever they happen…

12:23 Paul Matzko: So, someone likes your Instagram post, and you get an "Oh, that person liked it" notification, then another "that person liked it" notification. They've noticed that some people respond better when they cluster the likes so that the dopamine rush is bigger: instead of one every five minutes, you get a batch of five in half an hour, and suddenly five people liked my post, and it triggers that dopamine release more powerfully. And they can do that on the fly. It's not some literal engineer calculating that for each person; their algorithms, their artificial intelligence systems, are doing that without any kind of human oversight. They just say, "We wanna maximize these results, track individual people, figure out what gets them to engage more and more and more."

13:12 Will Duffield: Is that wholly a dopamine chasing story? I mean, I noticed that on Twitter, when I’ve said something foolish and a bunch of people are dunking on me, I won’t get all of those notifications at once, I’ll get them in a staggered fashion. And for me, if I’m getting a couple hundred people chiming in, or liking, or whatever else, I don’t want a constant stream of notifications, I’d rather those be packaged and given to me every half hour rather than my phone just dinging constantly. So, I don’t know, I can see other reasons why you might want to do something like that, but certainly, in the way Harris presents it feels [14:00] ____.

14:00 Aaron Ross Powell: But I think one of the points or takeaways of this production cycle is that in the past, a product designer would have had to sit down and think about how this product might be used and make those kinds of considerations. Like say, do I think, and based on the evidence that I might have, which is gonna be a little bit rough, do I think that I’m gonna get more engagement by clustering versus dripping them out at a more even pace? And is engagement, is the kind of engagement, that creates the kind of engagement I want, is it gonna lead to long‐​term less use of the product? Or is it gonna hurt my brand reputation because people will feel addicted, whatever?

14:43 Aaron Ross Powell: But now, you don’t even need to make that decision. You can just be like, “Well, what I’m gonna do is put out two versions of the app, randomly assign people to buckets, and see what works,” and I’m gonna optimize on the measurables, because you can then, it’s nice to be able to go to your boss and say, “Look, all the measurables have gone up.” And it might be the case then too, that like Will, you seem to respond better when they’re clustered, but Paul, you seem to respond better when they’re spaced out, and so you guys are gonna get what you get.

15:12 Aaron Ross Powell: And from that perspective, this causes kind of… You can imagine someone who is not a libertarian looking at this, someone whose natural inclination is to ask, "Well, what role could the state play in this?" This seems almost like the market breaking in a certain interesting way. Because you've got people, product designers, following incentives. Their incentive is they want their product to stay in business. These are incredibly competitive markets, the advertising margins are razor thin, so every last bit of engagement is important. So, they're simply following reasonable incentives.

15:51 Aaron Ross Powell: But the result in the aggregate is that our phones are filled with these things that have been laser-focus designed to make us behave like the rat hitting the lever to get the treat over and over again, which most of us think is, on the whole, not a good way to live, right?

16:11 Paul Matzko: Well, it’s causing burnout, right?

16:13 Aaron Ross Powell: It's causing burn… And if we could step back and inject other values into the design process, we might say, let's try to design different products. But no individual actor can be the one who does that, because then all it means is that your app is just not gonna get used as much as the other ones and it's gonna go under. And so is this the…

16:32 Paul Matzko: You’re not gonna get the funding, you’re not gonna boost your stock price, and yeah.

16:34 Aaron Ross Powell: Is this the kind of situation where what you need is an outside third party that has a different set of values to kind of impose and say, “No, you need to… We need to put a limit on this or you need to pursue other values.” Is that the only way out of this kind of vicious cycle?

16:53 Paul Matzko: That'd be a real twist for a [16:55] ____ podcast for us to be like, "Actually, we do need the state to step in." No, no. It is a reasonable line of argument; it logically proceeds from some prior assumptions. I think as we look at what's happening, though, we see adjustments being made without state action, and we can talk about that some here. I mean, even Tristan Harris, who I just mentioned, is part of an organization called The Center For Humane Technology. He's joined by the guy who invented the infinite scroll, Aza Raskin.

17:30 Paul Matzko: There's a number of former engineers who are all saying, "Wait a second, we didn't intend for it to be like this or to get this bad. Let's roll this back." Even Apple and Google, as we'll talk about here in a bit, have started to introduce measures that functionally discourage people from using their products as much. We're seeing voluntary, self-organized action. So, I think that's the libertarian answer: we don't need the state to do this. In fact, if it tried, it would be super clumsy about it. All I had to do was watch the congressional hearing of Google's CEO the other day to realize how ham-fisted…

18:08 Aaron Ross Powell: You should check out Will’s Twitter feed during any of these hearings, ’cause it’s just him banging his head into the desk over and over again.

18:14 Will Duffield: You can come by the office and just hear me banging my head…

18:15 Paul Matzko: I hear it through my door. Yeah, I can hear it down the hall. So… I think we can rely on the market to correct its own failure here. But in thinking about whether this is market failure, what's interesting to me is the extent to which it's just human failure, inasmuch as we decided, for a variety of interesting, historically contingent, happenstance reasons, to hand this over to a bunch of twenty-something-year-olds, often college grads or college dropouts, young people, most of whom don't have training in the humanities, in ethics, in philosophy, in why or whether you should do things. They just have the skill set to make stuff. And then we told them, "Make things people use a lot," and they did, and didn't appreciate the ramifications of it. So, take the infinite scroll.

19:12 Will Duffield: Is that all that novel? I mean you look at the creation of the interstate highway system, or the Manhattan Project, or…

19:21 Paul Matzko: I don’t think they were that young, right?

19:22 Will Duffield: Plenty of them were, I don’t know.

19:26 Paul Matzko: But let me put it this way. I think Aaron mentioned this in conversation off the air: what's different is the kind of adoption curve. So, let's say you create a new interstate highway system. Yeah, there are all these unintended knock-on effects for changing socialization.

19:41 Will Duffield: You’re designing for a very specific purpose without a whole lot of thought given to the broader second order effects of what you’re building.

19:50 Paul Matzko: Right. And that affects, first of all, hundreds of thousands of people, and then, as it starts to re-organize the built structure of America, millions of people, then hundreds of millions of people. So there's a curve over the better part of a century of it transforming American society. Whereas you can create the infinite scroll, and within a year, billions of people around the world are…

20:11 Will Duffield: One generation or more.

20:12 Paul Matzko: Are using this. Right. So, there's an exponential curve in adoption rates going on that's, I think, significant here. That makes a difference: the sheer scale of adoption, the speed of it. And the fact that you have young folks who weren't trained to think about these questions making those decisions and experimenting. We're all either part of the experiment or the control group. And that's curious; I think that is different and interesting.

20:49 Will Duffield: So, what do we have so far in terms of hard data on the effects of this? Obviously, again, it is all still pretty new, but you can look at someone my age: I grew up with dial-up, got a smartphone, and here I am today in the app, Web 2.0 environment.

21:12 Paul Matzko: So, there are some… There's a big longitudinal study, and the science is always behind the tech. I think Harvard is running the Adolescent Brain Cognitive Development study, with 10,000-plus kids involved. They're gonna track basically an entire generation of kids who were born into this tech, unlike us, who got it mid-development or later, and what the effects are gonna be over the next couple of decades. So I don't think we're gonna have the best data for a while yet, but on the early results, there have been some good pieces talking about the generational effects. I know, Aaron, you despise millennials but…

21:57 Aaron Ross Powell: I don’t despise them just… They’re doing the best they can.

22:03 Will Duffield: In the broken world we’ve been left.

22:04 Paul Matzko: Broken world we've been left. So millennials, but especially Gen Z, are having less… I guess you can call these upsides, in a sense. They have less sex, about a third fewer sexual partners on average. They become sexually active a year or two later in high school, on average. They take fewer drugs, and they have about half the criminal behavior of Gen X or the Boomers. The flip side is they also have significantly higher depression rates and a higher suicide rate. And this is harder to measure, but it kinda goes with the criminality and sex and drugs: they might be less anti-authoritarian, which I think is interesting to think through as a libertarian. So there's almost an anesthetizing effect with this technology.

22:54 Aaron Ross Powell: They don’t have rock and roll, either.

22:55 Paul Matzko: No, no. Rock and roll is dead.

22:56 Aaron Ross Powell: Rock and roll is dead. So the sex, drugs, and rock and roll is gone for an entire generation.

23:01 Aaron Ross Powell: I don’t know how people like that can turn out okay.


23:06 Will Duffield: How confident should we be that all of these effects are linked to tech? Couldn't it just as likely be, say, the fact that they all lived through the Great Recession, watched their parents lose their jobs, maybe had less going-out money as kids, and became risk-averse as a result of just watching that collapse? I think that's as plausible as the idea that it's all smartphone-driven.

23:33 Paul Matzko: Tess just pointed out that Gen Z has Halsey. Does that mean something to you, Aaron?

23:37 Will Duffield: Yeah, Halsey is fun. Good driving music.

23:40 Aaron Ross Powell: I have no idea what that is. [laughter]

23:41 Paul Matzko: Me either.

23:42 Will Duffield: So she's like a synth-pop princess.

23:47 Aaron Ross Powell: But I think we can all… We can bracket the… What kind of scientific data do we have? Is it too early to have much data, because we haven’t had people who have really been raised, immersed in this stuff from day one and also the causal questions, Will, that you raised of how much of this stuff is caused by the tech versus other environmental things or cultural things, or whatever else. But I think it’s the case that all of us who have a smartphone can recognize our own addiction to the smartphone.

24:21 Will Duffield: Oh yeah, totally. But I feel like that’s on me.

24:23 Aaron Ross Powell: And you can recognize… I can see how, compared to before I had a smartphone, or before I had a smartphone good enough that I actually wanted to use it fairly regularly, which that early Android phone was not, I now have a harder time with sustained focus, say. I have a harder time just sitting down and reading a book for a couple of hours the way that I used to. It's probably very good that I didn't have a smartphone in college, in law school, because I probably would not have done as well. The kind of subjective experience is both very real for most of us, and appears to be incredibly uniform among tech users.

25:13 Will Duffield: I still can't help but think of it in terms of individual responsibility and intentionality. And I could be off-base there, but I can pick up my phone and bounce around on Twitter or just scroll through Facebook aimlessly. Or I can open the Amazon Kindle app and keep reading whatever novel I was into on my smartphone, and it's up to me to choose which of those things I want to do. I just often don't approach the use of technology with as much intentionality as I ought to, and neither do other people, but…

26:00 Paul Matzko: So let’s say…

26:00 Will Duffield: But I don’t see that as a tech issue.

26:02 Paul Matzko: Let's put this addiction question in context, 'cause we're talking about how we don't blame addicts as much as we'd otherwise blame someone. There's this sense in which it overwhelms your conscious decision-making process. So if we're having an addiction conversation, put that in another context.

26:20 Will Duffield: But I can’t read a novel on my crack pipe though.

26:22 Paul Matzko: Right. So, do we blame someone who uses drugs for the decision to become addicted to opioids? Well, yeah, we do. There's individual responsibility, but we also know that there are all these structural factors and unintended consequences of, essentially, the FDA and the cartel of pharmaceutical companies pushing opioid use on people as well. So, it's both a structural pipeline issue and individual responsibility and raising awareness, and there are all these unintended ill effects. It's obviously not as severe as opioid addiction, but if you think of it in that context, I think we don't have to choose between, "Yeah, it's your fault for being obsessed with Twitter and dunking on…"

27:08 Will Duffield: It is though, it is my fault. I ought to use technology better.

27:13 Aaron Ross Powell: But this is…

27:13 Will Duffield: I failed to do so.

27:15 Aaron Ross Powell: But ultimately the agency rests with you, and we can say it's not the tech's fault, because the tech doesn't have agency; it can't make you pick it up, it can't make you use it.

27:24 Will Duffield: I downloaded Twitter and made an account in the first place. I could have never started down that path.

27:29 Aaron Ross Powell: But it’s also the case that the environment in which we find ourselves can make good behavior, however we wanna define it. Or the kind of behavior that we would aspire to, easier or harder. And so…

27:47 Will Duffield: Yeah, but that’s not just the phone environment, that’s how I’ve architected my Saturday morning. If I don’t eat breakfast and don’t get out of bed and just lay there for two hours, two hours in, I am much more likely to just pick up the phone and scroll through it. If I get up and go for a run, and eat a good breakfast, the phone doesn’t have the same appeal.

28:06 Paul Matzko: So, let's take some of the onus off, if you will. You can rest easy now. Just like I was blaming twenty-something-year-old, insufficiently aware designers for doing things whose second-order effects they didn't understand, that same thing is true of us. It's not like we were aware that, "Oh, if I use this tech, it's gonna really drastically change my daily experience and my brain chemistry." Only in the last couple of years are we starting to wake up to the fact that, "Oh, every time I get a notification, there is a cortisol response; my body pumps a chemical into my brain, the fight-or-flight hormone."

28:49 Paul Matzko: So every time I get that notification, my body says, "Oh, you gotta do something," which is why you reflexively look at your phone when you get notifications. And the companies structured these notifications to try to trigger either dopamine releases from good ones or cortisol from bad ones, and that builds in a kind of baseline, constant, routine anxiety. Now, when you made the decision to use any of those apps, you weren't aware of any of that.

29:20 Will Duffield: Two points on that.

29:20 Paul Matzko: You didn’t have the knowledge you needed to make an educated decision in that regard, right?

29:24 Will Duffield: I think, one, I can use the app without getting the notifications all the time. You should turn the notifications off. And frankly, it's rather amusing that, amid this panic about brain hacking and tech addiction, the one notification you now struggle to turn off on your phone is the one that pops up every day telling you how long you've been on your phone.

29:44 Paul Matzko: Yes. But that's information we've only had in the last year or two. In 2016, no one was talking about dopamine and cortisol, or the effects of the infinite scroll, the effects of how… This is a new conversation.

29:57 Will Duffield: No. But you can look at how you’ve made use of some app and think about whether it’s healthy or not. If you’re on Tinder, say, and you’re matching with people but you aren’t really sending them messages, you’re just getting a little warm glow knowing that someone likes you because you don’t feel like going out, whatever. You can look at that and say I’m not using this in a very healthy fashion. I don’t think I need knowledge about how dopamine is rolling around my brain to draw that conclusion.

30:26 Aaron Ross Powell: But this brings up the other thing we talked about, and what you raised in response to the “shouldn’t there be a law” question: the tech companies are starting to bake in… So, part of it is you have to recognize you have a problem. And so the…

30:44 Paul Matzko: Hi, my name is Paul.

30:45 Aaron Ross Powell: Android and iOS, in recent updates, have baked in applications that will tell you how long you’ve been using a given app, how long you use your phone, how often you pick it up, how often you unlock it, that kind of stuff. And even for someone who’s subjectively aware of, “Oh, I’m on my phone a lot,” the first time you get those numbers, they’re presented to you like, “Today, you spent X number of hours on Twitter, and opened it up once every six minutes,” or whatever. It’s just mind‐​blowing, and you’re embarrassed and hope no one sees those numbers. It’s the same reaction as when you used to hear those stats about how much TV the average American watched and you’d be like, “God, how can anyone possibly watch that many hours of TV?” And then it turns out you’re doing three times that much Twitter every day.

31:40 Aaron Ross Powell: So, the tech is… I think before, it was harder to know, especially when you’re in an embedded environment. When I take the metro or the bus home from work each day, it’s rare to see a single person who isn’t on their phone. So you’re in an embedded environment where it’s kind of accepted, like the fact that all of us, during the half hour we’ve been talking, have checked our phones multiple times.

32:06 Paul Matzko: Mine is on airplane mode. I have checked the time and nothing else.

32:10 Aaron Ross Powell: That’s still checking the phone. We’re so embedded in a culture that does this. And so then the question is, can the tech push us back out of this by giving us this kind of nudge of, “Well, you’re using it too much,” or enabling me, like Ulysses asking his men to tie him to the mast of the ship? I know that I lack the willpower to overcome this. So, what I want you to do, Apple, is lock me out of Twitter after 20 minutes each day.

32:42 Will Duffield: Oh, you’ve long had browser apps that will keep you from visiting certain sites while you’re trying to study. One very minor, marginal design‐​feature shift in the latest generation of Pixel phones that I like from this perspective is, it’ll give you an option to turn notifications off when you set the phone face down, and there’s a nice physicality that accompanies that “Do Not Disturb.” Functionally, it’s healthy to build in breaks. I mean, all these things are… Whether it’s your phone, your iWatch.

33:23 Paul Matzko: Apple. Obviously, I don’t have one. An Apple Watch that’s saying, “Hey, take a minute to breathe, you’ll live 20 years longer, or a thousand years longer, or whatever, if you do.” Or it’s your app telling you… I think, Tess, don’t you have a… Does your phone tell you to get off at 10:30 or something like that?

33:40 Tess Terrible: Hi guys, yes. My app tells me at 10:30, it’s time to go to bed and usually at that time, I’m scrolling through Instagram.

33:47 Paul Matzko: Did you set the 10:30?

33:49 Tess Terrible: Yeah, yeah. I have it set so at 10 o’clock, it reminds me it’s almost bedtime. And at 10:30, my screen shuts down and it shows this little… What would you call it? The sand thing.

34:02 Paul Matzko: Oh, yeah, yeah, yeah, the time, yeah.

34:03 Tess Terrible: The timer saying it’s time to go to bed and your screen time is done for the day.

34:09 Paul Matzko: When you get to 10:30, does it read the children’s story book, “Go the f*** to Sleep?”

34:14 Tess Terrible: I mean, I wish.

34:16 Will Duffield: Oh, oh, whoa, this is a family podcast.


34:20 Paul Matzko: Wasn’t thinking of that when I said that.

34:21 Tess Terrible: Is it though?

34:21 Paul Matzko: Yes, is it though? Is it? I did recommend diapers in our Thanksgiving [34:26] ____.

34:26 Tess Terrible: No, but you can get that on YouTube now. But it also gives me the option to extend my time on whatever app I’m using by 15 more minutes, and then in 15 minutes, it tells me to go to sleep.

34:37 Paul Matzko: Like the opposite of the snooze function, like that?

34:40 Tess Terrible: Yeah.

34:40 Will Duffield: I think where it can get very interesting is when the application will actually incentivize you, with reference to your in‐​app goals, to log off for a while. Think about something like the rested experience function in World of Warcraft. When you’re offline, it’ll build up a bar of double experience, and while you’re playing, that doesn’t build up. So the idea is, don’t binge it; take a few hours off and frankly you’ll benefit from doing so.

35:15 Paul Matzko: And you’re right. We’re building in break systems, a way of incentivizing, nudging people into getting off their device, getting out of a program and spending some time IRL, in real life. And this is voluntary, it’s not requiring state action. I think that’s encouraging from a libertarian perspective, but this is a very old concept. And if I can take a moment to step back and put on my historian hat, it’s history time, which is just like hammer time but with way worse pants… Better pants.

35:49 Paul Matzko: This was actually a question in literature. If you go all the way back to the 18th century, it used to be that writing didn’t have chapter divisions; those were actually somewhat unusual. You wouldn’t divide things into chapters, it was just one long litany of words. And one of the innovations in the early novel was, we’re gonna break that up. We’re going to divide it into chapters, which incentivizes, nudges people into taking a break. So, Henry Fielding, who was an 18th‐​century novelist, said, “Those little spaces between our chapters may be looked upon as an inn or resting‐​place, where he may stop and take a glass or any other refreshment as it pleases him.”

36:33 Paul Matzko: So this idea, as a tech, is pre‐​digital; it is very old. The idea that when people are engaged with some kind of entertainment, or any kind of engagement really, it is natural from a design perspective to build in breaks. And that in so doing, even though it discourages sheer quantity of engagement, it makes the engagement more sustainable and more enjoyable for the user.

37:03 Paul Matzko: So functionally, what Google and Apple, what all these companies are doing in response to these concerns, is implementing four‐​century‐​old tech in our devices, which I think is kinda cool. It’s a very old conversation being applied in a new way.

37:17 Will Duffield: The whole conversation, I think, is quite old outside of technology too. It’s also difficult to live intentionally, to be mindful of how you’re interacting with others, making use of technologies, etcetera. It’s not localized to the smartphone; it’s hard to be a virtuous person.

37:39 Aaron Ross Powell: Your kind of literature‐​structure‐​as‐​tech reminds me, it can go in the other direction, too. I’m reminded of the crime novelist Ed McBain, or Evan Hunter, which was his real name that he wrote under. He talked once about how he would try to structure paragraphs to make the page easier to read, so that the reader just kept going. Because when you encounter a giant block of text, it’s like a hurdle to overcome; you read it slower or you lose your place. So if you break things up in certain ways, or you have the dialogue flow in certain ways, he figured he’d keep his readers.

38:23 Paul Matzko: Well, you’ve got the [38:25] ____ How do most crime novels… They still have chapters, so how do they end the chapter? A cliffhanger, right? You want them to take a break, but you want them to come back. And I’m sure we could think of tech corollaries for your smartphone. Yes, Google might at 10:30 tell Tess to go the f to sleep. But next morning, it’s gonna have a bunch of notifications just pinging away, “Come back, Tess, come back.” It makes that sound, I assume.

38:54 Tess Terrible: No, because my notifications are turned off, except for my kiddo notification. When you guys are texting me at 10 o’clock, “Where’s the podcast?” I’m like, “I don’t know.”


39:06 Paul Matzko: You’re so much better than… You’re so much better than me, Tess. So, okay, we have these responses to this problem, design choices being made. Maybe another angle to take on this: there are programs specifically designed to help us be more, to use Will’s word here, mindful. Mindfulness software. Am I using… Is it mindfulness, is that the phrase? Yeah, there are apps that help you meditate. I understand, Aaron, that there’s an app that promises to enlighten you to Buddha‐​like levels within a day or two.

39:45 Aaron Ross Powell: Probably not. Such a promise would be hard to fulfill. But yeah, there’s… So far, we’ve been talking about tech that addicts us, or tech that kind of makes us unmindful in the way we’re approaching things, and then ways that the tech can place stopgaps to prevent us from going down that road. But then there’s this whole line of tech that is supposed to elevate our level of focus and our level of mindfulness.

40:11 Aaron Ross Powell: And so, yeah, some of the most popular apps in the App Store are these guided meditation apps. Headspace is probably the most popular one. Lots and lots of people use them. They make a ton of money; executives in Silicon Valley encourage their employees to use these sorts of things. My mini rant about them, which is why you teed this up, ’cause I have made this rant before, is about guided meditation apps. If your goal is the sort of mindfulness that meditation is supposed to cultivate, this generally Buddhist practice of mindfulness meditation focused on the breath, it’s called Vipassana, if that’s your goal, guided meditation apps like Headspace are just… Don’t bother.


41:03 Aaron Ross Powell: They’re not… Because these are techniques, this kind of focus thing is a technique that takes 90 seconds to learn. It’s very easy: just focus on the breath, and when your mind drifts away, focus on the breath, and when it drifts away, focus on the breath. So having someone explain it to you, like a five‐​minute guided meditation that just gets you “here’s how it kind of works,” is all you need. And then these apps are like, while you’re there, here’s a 10‐​minute meditation on leadership. It’s like, “No, what you’re listening to is a soothing, slow‐​paced TED Talk on leadership while pretending to meditate.”


41:35 Paul Matzko: It’s like one of those things, like a movie from the ’80s where they pop in the cassette and it’s like, “You are a strong and powerful person, be a leader today.”

41:44 Aaron Ross Powell: Yes. So, it’s affirmations or something, but it’s not mindfulness meditation, and it certainly is not helping you improve your mindfulness meditation; I think it’s actively interfering with it. So, if you’re serious about pursuing that, just get yourself a timer and sit there and focus on the breath.

42:00 Will Duffield: But there are ways that tech could be used to help, if not improve, at least gauge the efficacy of meditation.

42:10 Aaron Ross Powell: Yeah, so this is the kind of out‐​there, 10 to 20 years down the road stuff. Tech right now kind of interferes with our ability to focus and live mindfully, which is the theme that a lot of people pick up from all of this. But then you can see, you might wanna call it sci‐​fi stuff, but it’s not that far off: if focus, being in a state of focus, of mindfulness, of awareness, of non‐​distraction or whatever you wanna call it, is simply a brain state, and we can measure brain states, then you can imagine all sorts of crazy biofeedback.

42:52 Aaron Ross Powell: Think yourself into the state and the machine will buzz or give you some indicator when you’re there. So you know very clearly, “Oh, this is what that feels like.” And as you drift away, “Okay, that’s what drifting away feels like.” And as you drift toward it, “Okay, that’s what… ” And you can imagine it radically accelerating our ability to cultivate these kinds of traits and mental states that…

43:13 Paul Matzko: Like a metal detector for enlightenment.


43:16 Paul Matzko: Beep, beep, beep, you’re getting close.


43:17 Aaron Ross Powell: Right. Or those detector things the Scientologists use.

43:20 Paul Matzko: The E‐​meters?

43:20 Will Duffield: We don’t need to go there.


43:24 Aaron Ross Powell: But, yeah, this is the interesting thing. The era we’re in, with this kind of technology, with smartphones, with apps, with social media, with addictive tech, feels incredibly pervasive right now. It’s everywhere, and it feels like this is just the way things are and it’s just gonna get worse, because these people are gonna figure out even more sophisticated ways to get us using their stuff. But I suspect that in 10 or 20 years, this culture that we live in, of everybody on the bus staring into their smartphones all the time.

44:02 Aaron Ross Powell: And the moment you get onto the elevator, the first thing you do as you’re riding up three floors, because you’ve got that extra 20 seconds of time, is pull out your phone and reload the Washington Post or whatever. That’s going to look incredibly bizarre to people not too far into the future. They’re just gonna be like, “What went wrong with these people?” I think that this problem will be solved.

44:26 Aaron Ross Powell: It’ll probably be solved partly through cultural shifts, through awareness, as we all become aware that this stuff is addictive in a way that we weren’t five years ago, or at least weren’t talking about. And partly through the underlying technology, as the people who are making it say, “Oh, I forgot, there are other values that matter,” values that matter to me as a human being: I don’t wanna be creating a world where everyone is addicted. But also as consumers start to say, “I don’t wanna buy tech that’s simply addicting me. I wanna buy tech… ” These are features worth paying for: tech that’s not just gonna not addict me, but maybe even take it further and help me in these regards. And so I think that this conversation, to some extent, is gonna look kinda silly in 10 years.

45:13 Paul Matzko: Yeah. Well, and we’re at kind of peak concern, we’re at that inflection point where folks on a mass level are just starting to wake up to, “Oh yes, I’ve noticed that when I get off social media, I feel happier, and when I engage, I feel miserable. That’s a real problem that we should think about collectively and do something about.” We’ve started entering that peak‐​concern moment where we’re gonna start to solve the problem. I’m actually reminded of a previous bit of a tech panic, I guess we could call it, which is the concern in the 1950s and ’60s over…

45:53 Aaron Ross Powell: About Pinball machines?

45:55 Paul Matzko: Well, [chuckle] there actually was that worry that pinball [45:57] ____ was gonna corrupt the whole generation, and so you got state power to stamp out the pinball machines. No, I was thinking of marketing, basically. “Mad Men.” That modern advertising was gonna corrupt our minds. They didn’t use the word brainwash in this… Well, maybe. Anyways. We were basically being brainwashed by advertising, against our will. There’s nothing you can do. They’re so clever in how they can manipulate sound and image and ideas to create captive audiences, and we’re all mindlessly consuming, and the modern ad man is the god of consumption.

46:33 Aaron Ross Powell: But tech fixed that with the “They Live” glasses, and then you can see the advertising zombies and avoid them and you’re fine.


46:39 Paul Matzko: That’s right. Well, but it’s the same kind of thing. There was a moment of peak concern, and functionally everyone said, “This is a problem. Maybe we should be more… ” We’re not worried about that to the same extent now because we all kind of wised up as a society. We’re not mindlessly following advertising like the folks in the late ’40s and early ’50s worried we would. And I think the same thing could apply here. Just as we look back on those concerns as being a little bit paranoid now, folks will do that down the line, like you’re saying, Aaron; I can imagine that. And maybe one last thought should mitigate this. This is something you shared, Will, an article by John Richardson for the Intelligencer on the children of Ted. We’re kind of saying, “Hey, chill out, it’ll be okay,” but there’s another group who are kind of going in the extreme opposite direction.

47:34 Will Duffield: Oh, they’re just neo‐​Kaczynskists.

47:36 Paul Matzko: The Unabomber, Ted Kaczynski.

47:38 Will Duffield: Yeah. And they see the fruits of industrial society as being, if not universally, at least on balance, negative or disastrous for humanity. I think there are some libertarian concerns there, though more broadly there’s a naturalism inherent to it that is often at cross purposes with the kind of human liberatory elements of libertarianism. You could almost do a whole episode on them…

48:17 Paul Matzko: We probably should, we should do that.

48:19 Will Duffield: In their own right. So, it’s not liberal stuff. When you look at the world they imagine coming about after industrial society has been sabotaged and collapses, it’s red in tooth and claw, and fairly fascistic in terms of how a survivor might go through their life.

48:40 Paul Matzko: Yeah. Well, it is something of that tune in, turn on, drop… I’m trying, but I can’t remember the phrase.

48:48 Aaron Ross Powell: Turn on, tune in, drop out.

48:49 Paul Matzko: Leary’s phrase. There was a description of these proto‐​Kaczynskiites willing to use violence to tear down the industrial order, willing to kill people in order to do so, on a massive scale if possible, who are going back, like… They’re camping in the woods, learning survival skills, learning to be blacksmiths, forging their own hatchets. There’s this primitive, return‐​to‐​nature component to it. I think it’s gonna remain on the fringe. I don’t think many of our listeners are saying, “I’m really annoyed with the cortisol response I’m getting from my Google apps. I’m gonna go send mail bombs to people.” So, I’m not sure.

49:36 Will Duffield: Yeah, like, good luck. Industrial society is a hard nut to crack. Obviously, there are some potential failure points, power grids, that kind of thing. But more broadly, I think they’ve got their work cut out for them.

49:54 Paul Matzko: That’s right.

49:54 Will Duffield: And thankfully so.

49:56 Paul Matzko: We don’t wanna talk about their New Year’s resolutions for 2019. I think, on that note, we have somehow managed to go all the way from meditation apps to proto‐​Kaczynski bombers, go figure, but that’s all we have time for today. So, until next week, don’t check your phone as much, and be well.


50:21 Paul Matzko: Building Tomorrow is produced by Tess Terrible. If you enjoy our show, please rate, review, and subscribe to us on iTunes or wherever you get your podcasts. To learn about Building Tomorrow or to discover other great podcasts, visit us on the web at lib​er​tar​i​an​ism​.org.