Complaints about the control that social media companies exercise over their platforms have risen over the past couple of years. Some users complain that platforms like Facebook and Twitter should remove more offensive content; others complain that they remove too much, resulting in censorship of politically unpopular views.
There is an alternative to this kludgy, top‐down model of content moderation. Minds.com is a more decentralized social network with a First Amendment standard for moderation that allows users to appeal bans to a jury of their digital peers. Minds also has a different revenue model from the big platforms, allowing users to boost posts with proprietary cryptocurrency payments. Our interview with Bill Ottman, founder and CEO of Minds, is a reminder that the future of the internet could look very different from the present.
What is the most successful social network? Who is innovating in the social media space? Why should you switch social media platforms? How do social media giants come between you and your audience?
00:04 Paul Matzko: Welcome to Building Tomorrow, a show about what the future could look like if we embrace the freedom to innovate. This time of social distancing has highlighted just how important social media is. Imagine a time, not that long ago, when being confined at home would have been an even more isolating experience. No smartphones, no video conferencing, no digital substitute for real‐life interactions at all. But as our reliance on these platforms has increased, so too has our awareness of just how they can surveil us, how much they know about us. Here at Building Tomorrow we’re not inherently opposed to targeted ads and other ways that social media platforms have for monetizing the data we create on and with them, but these companies do have immense potential social, economic, and political power, and they don’t always use that power in the service of free minds and free markets. I can’t fault anyone who feels alienated by that surveillance. But since the Internet remains, for now, one of the most free and open sectors of the American economy, there aren’t regulatory barriers preventing challengers to the big tech platforms from experimenting with different business models and different approaches to content moderation.
01:20 Paul Matzko: One such alternative is the social media platform Minds, which has tried to strike a better balance between privacy, moderation and decentralized authority. I’m joined by Aaron Powell, and it’s our pleasure to talk to Bill Ottman, the founder and CEO of the Minds social media platform. Welcome to the show Bill.
01:39 Bill Ottman: Hey, thanks for having me.
01:40 Paul Matzko: What does it mean that Minds is decentralized? How much control do you have over it?
01:46 Bill Ottman: So I like to say we’re decentralized with the D in parentheses. There are very few projects that are fully decentralized. It is sort of impossible to be fully… There’s always gonna be centralization and decentralization sort of in tension with each other. But the components of our network that are decentralized are our payment system. So we use the Ethereum network for our token system, and users can pay each other and subscribe to each other with Ethereum, Bitcoin, or our own ERC20 token, which runs on the Ethereum network. And certain transactions can publish to the blockchain. So we have that, and then we also have a decentralized governance system, which is a jury system where basically if a user disagrees with a decision that we made, they can appeal it to a randomized selection of 12 active users, and then those users vote on whether they think the decision about the content in question should be reversed. So that has been extremely successful; if you look at the content policies and some of the censorship issues on big networks, Facebook, YouTube, Twitter, there’s all this controversy around this.
03:23 Bill Ottman: So our policy is First Amendment based. So we really try to stay as neutral as possible with these issues, but there are always edge cases where we could make a mistake. And so we built this system to sort of protect ourselves from ourselves. And that’s been awesome, people really appreciate it. And then we are an open source project, so anyone can actually take our code and make their own version of Minds. You can launch your own app with our code, and then we’re working on a federation system so that the nodes can talk to each other. And then we’re also constantly looking at new solutions for decentralized content storage. There’s a really cool project, we’re building a prototype for now, which is called Arweave, which has built something that they call the permaweb, where you can actually publish content, right now, it’s mostly just text and images, but it’ll be video in the future. And it cannot be deleted, ’cause you’re publishing to a blockchain‐like distributed file system that cannot be censored. So we’re balancing, providing a user experience that is usable, which you need certain centralized functions for. So we do run central servers, but then we’re also kind of constantly trying to push the envelope. So I wouldn’t call us a fully decentralized social network, that’s just… That’s more of our goal.
04:58 Aaron Ross Powell: How much of this is effectively a return to the way things used to be? Because most of us today are used to this world of centralized, single‐provider social media. Twitter controls its servers, Facebook controls its servers, and we basically, by their permission, use their servers and their services. But arguably the largest and most successful social network is email, and that has been… Email is just a protocol, and anyone can write their own email server, and anyone can write their own email client. And the clients can talk to the servers, and the servers can talk to each other. And going back even further than the email we’re used to today, we had things like Usenet, which was another just open protocol. Is that kind of what we’re headed back to, social media as protocol?
05:54 Bill Ottman: I totally agree. Yeah, most people, granted, are not running their own email server. Most people are using email now similarly to how they’re using Facebook, which is like using Gmail. When the Internet first started, it was like, “Hey, you wanna do email, set up a server.” [chuckle] And so I think that the challenge for the decentralization and kind of Internet freedom movement in general is to make that experience of hosting your own server as easy as possible, but at the same time, building out distributed systems like BitTorrent‐style systems or these sort of blockchain hybrid systems. It’s all about making it easy for people, because what Web 2 did was make everything so easy. You could just log into these sites and everything was done for you and it was all in the cloud, and that was a super valuable evolution of the Internet. And so, we wanna maintain that ease of UX but also kind of ingrain the freedom within it, that we lost with Web 2, and that’s what Web 3 is really trying to accomplish.
07:17 Aaron Ross Powell: How do business models fit into this? Because obviously if you are Facebook and you control everything, all the users have to come to your platform and use your servers, then your business model is pretty obvious. You can tell advertisers, “Hey, this is where everyone is. They can’t be anywhere else, really, if they wanna interact with their friends and family who are on this network. So I’m gonna let you sell ads to them, and it’s gonna show up on the platform and it’s gonna show up in all these ways. I control the clients, so you know exactly what it’s gonna look like, it’s gonna be uniform and so on.” But as we move to decentralization of platforms, it seems like that becomes a harder thing. You could imagine a scenario where Minds enables other people to write clients against it, and suddenly those people are writing clients that circumvent whatever business model you have set up, and now, you’re basically running a public service as opposed to something that you can earn a living off of. So how do the business models work? And it’s particularly interesting, like you mentioned, you use Ethereum or Ethereum network tokens and whatnot; how does crypto fit into that, and does crypto solve some of those business model problems?
08:36 Bill Ottman: Yeah, so in terms of the competition with the public, you don’t even need to get into distributed systems to sort of talk about that. We can look at a company like WordPress as a really important example, because open source, for anyone who doesn’t know, that just means that we share all of our software, anyone can take our code and do whatever they want with it as long as they credit us. And so, there’s many companies that took WordPress’s code and started competing against WordPress for website hosting, and WordPress brilliantly encouraged that. And so, there’s sort of this counter‐intuitive thing going on where the old traditional mode of thinking would say, “I need to keep my secrets secret. I don’t wanna tell anyone the blueprints of my project, and that is what is going to give me the competitive advantage,” but that’s actually not what we’re seeing happening now. Now you need to create network effects, and so there’s starting to become a pressure to share your code, both for the network effects and for public transparency.
09:57 Bill Ottman: So you get the result: 30% of the websites on the Internet run WordPress now, and that’s because they gave it away. So that’s one component of it. And then in terms of… I don’t really think it’s either‐or, in terms of centralized systems and decentralized systems. I think that the crypto movement has a tendency to maybe just over‐market in terms of decentralized everything. I think it’s a great goal and it’s really important for people to be thinking about that, but that’s not even how systems work. We always have centralized components of any system. If you just look at network theory, you’re not ever gonna have everything fully decentralized, that’s just not how it works. There’s always gonna be centralized clusters of activity or interests in networks. So I think that it’s about actually more of a healthy balance between centralization and decentralization, because there are so many incredibly valuable software systems that run on central servers. We don’t need to demonize centralization, we just need to hold it accountable much more and make sure that when we’re interacting with centralized apps, we can see the code, and we can have a really clear understanding about what’s going on.
11:37 Paul Matzko: So Minds opened, I think 2015 is the date I’ve seen when it launches, but it’s not until a few years later that you add in the kind of Ethereum blockchain component. What changed for you as a result of that and was it necessary? As you pointed out, you don’t actually have… Clearly you don’t have to have the blockchain for Minds to work. What did that do, what did that add that you didn’t have before?
12:06 Bill Ottman: So we had this virtual currency system prior to moving to Ethereum, and this was always one of our most popular features. You would earn points for all of your engagement on the network and contributions, and then one point, you could exchange for one view. So we sort of gamified attention a little bit, and then rewarded people with more exposure for their contributions, and this was very valuable to people because the algorithms on Facebook are really suppressing everyone’s reach. You’re only reaching a few percent of your own followers on Facebook, ’cause their news feed is such an amalgamation of madness. So people are struggling to get their message out, and so to have this place, “Hey, you can earn views,” essentially, people really enjoyed that. And so that became one of our most popular features, and then once blockchain and Ethereum started maturing a little bit, it’s still very immature, so I’ll get to that. But we saw the value of, “Hey, we have these points on our servers, let’s take these points off of our servers and let people store the points on their own device. So now people can actually withdraw the tokens from Minds into their own wallets on their own device, and they can go and play with them anywhere on the Internet.” And so that was super valuable.
13:37 Bill Ottman: Additionally, the whole token economy is much more transparent now in terms of the supply, in terms of the peer‐to‐peer activity. Now we do have a hybrid on‐chain off‐chain system, so we do handle some of the token… To be honest, most of the token activity is still happening off‐chain, but we encourage and actually reward people to bring it on‐chain. The issue is that in order to bring tokens on‐chain, you need to set up a wallet. We use a tool, a browser extension called MetaMask, where you can pull the tokens off, and when you use the tokens with MetaMask you have to pay what’s called a gas fee on the Ethereum network in order to do a transaction. Like, say I wanna send you some Minds tokens on‐chain. I have to actually pay a tiny bit of Ethereum in order to send that to you, as opposed to off‐chain, where I just click a button.
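The hybrid system Ottman describes can be sketched roughly as follows. This is a hypothetical model, not Minds’ actual code: off‐chain transfers are free database updates on central servers, while moving tokens on‐chain requires the user to cover a gas fee in ETH. All class and parameter names, and the fee values in the example, are illustrative.

```python
# Hypothetical sketch of a hybrid on-chain / off-chain token ledger.
# Off-chain transfers are free book-keeping; withdrawing on-chain
# costs a gas fee paid in ETH, as described in the interview.

class HybridLedger:
    def __init__(self):
        self.offchain = {}  # user -> off-chain token balance
        self.onchain = {}   # wallet address -> on-chain token balance

    def transfer_offchain(self, sender, receiver, amount):
        """Free transfer: just a database update, no gas required."""
        if self.offchain.get(sender, 0) < amount:
            raise ValueError("insufficient off-chain balance")
        self.offchain[sender] -= amount
        self.offchain[receiver] = self.offchain.get(receiver, 0) + amount

    def withdraw_onchain(self, user, wallet, amount, eth_balance, gas_fee_eth):
        """Move tokens to the user's own wallet; the user pays gas in ETH.
        Returns the ETH left after paying the gas fee."""
        if self.offchain.get(user, 0) < amount:
            raise ValueError("insufficient off-chain balance")
        if eth_balance < gas_fee_eth:
            raise ValueError("not enough ETH to cover gas")
        self.offchain[user] -= amount
        self.onchain[wallet] = self.onchain.get(wallet, 0) + amount
        return eth_balance - gas_fee_eth
```

The asymmetry the sketch captures is the UX problem he raises next: the off‐chain path is one click, while the on‐chain path fails unless the user has separately acquired ETH for gas.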
14:38 Bill Ottman: And so we’re sort of in this position where we’re trying to educate people about how to engage with blockchains, which are, to be honest, quite slow and cumbersome and complicated. Granted, I think Ethereum and MetaMask are doing a great job, better than anywhere else, but creating a good UX in a very decentralized way is incredibly challenging. And so, we’re just taking it one step at a time and experimenting with as many distributed systems as possible, and over time, trying to move our infrastructure to more distributed systems. And to be honest, even big tech understands that peer‐to‐peer distributed systems are valuable. Netflix is closely looking at a tool called WebTorrent and other kinds of peer‐to‐peer bandwidth‐serving tools, because ultimately if you can offload infrastructure to the community, you can reduce cost, there’s more network resiliency, there’s all these different benefits of distributed systems, and that’s why we’re seeing institutional adoption of blockchain as well.
15:58 Paul Matzko: To be clear for our listeners, the way that would function for Netflix, if I’m right, is rather than everyone downloading every piece of the TV series or movie that they’re watching on Netflix directly from Netflix’s hosted servers they would be torrenting pieces of it from other Netflix users who have already downloaded those pieces. How would that apply to Minds?
16:23 Bill Ottman: Yeah, so we host video and…
16:25 Paul Matzko: Oh, okay.
16:26 Bill Ottman: Yeah, so for… And video costs are huge in terms of transcoding and the bandwidth, it’s… Video is one of our biggest costs so as much as we… And we do use WebTorrent. So that has been extremely helpful. So I think that those are gonna keep maturing and we just have to put as much energy into them as we can.
17:03 Aaron Ross Powell: So we’ve talked a lot about interesting technical stuff and I’m all on board with moving in these directions and I understand the technical differences and get excited by platforms that function in these ways, but for the typical user who has been using Facebook or Twitter or Instagram or whatever and someone says, “Hey, there’s this thing called Minds.” Why should they switch particularly given that everyone is already on the platforms that they’re using?
17:34 Bill Ottman: Oh, yeah, so basically, it sort of depend… When I’m talking to people, it really depends who I’m talking to, like, how I position the pitch. If I’m talking to someone that I know cares about global issues, for instance, then I might say, “Hey listen.” It’s sort of like picking between a McDonalds and an organic market. If you care about privacy, if you care about transparency and these different digital rights then I don’t even say that you should leave. I just say that vote with your dollars, vote with your attention and support alternative platforms whether it’s a Firefox or Brave for the browser, installing Ubuntu or Debian on your operating system, using apps like Minds, and there’s other distributed social networks and open source projects out there that are alternatives to the sort of mainstream proprietary stuff.
18:46 Bill Ottman: I think that supplementing what you use is essential in order to grow the movement. What we use every day on our phones and our computers, that use is what dictates the power structure of the Internet. Even just logging in a couple of times a week, that is super impactful. So that’s one type of person. And there’s millions of people who want to do that and understand that that’s valuable. Now, for the people who don’t care about privacy and just wanna do what’s convenient, but who still don’t necessarily love Facebook, we’ve been focusing on monetization deeply. And so for the creator who is just sort of a mainstream creator, who’s not into all the ethics and whatnot, we are trying to pay people for their energy and for their traffic.
19:52 Bill Ottman: So on Facebook, you are nothing. YouTube actually has a decent program for creators, but we are trying to directly pay people in tokens and fiat. Yeah, we have a pretty robust dollar payout system. For people who drive us traffic, we’ll pay an RPM for page views that you can generate. We pay for all these varieties of activities. So for the creator, everybody’s worried about demonetization on YouTube. Now, with COVID everyone’s home, everyone’s trying to figure out how to create new revenue streams to be independently financially sustainable. That to me is huge for us. Because if we can help people make money, then we can make money, and we can kind of have this symbiotic relationship with our community. Then for the typical consumer, the typical person who’s just reading away on social media, that’s probably the most difficult one, because we have a couple of million users, but we don’t have total critical mass, where every celebrity is with us. And so the popular, mainstream audience, that’s probably the most difficult pitch, particularly people who are not creators. But I think that we do have a thriving art and music community. We do have a lot of interesting content. And everybody is sort of becoming a creator, is the thing. Now everyone’s an amateur photographer on Instagram or whatnot. And to be honest, the Instagram algorithms are coming for you, they’re coming for you.
21:40 Bill Ottman: Everyone’s seeing their likes go down on Instagram and Twitter and YouTube and Facebook. And so there’s the human nature component: what we’re saying is, “We are not going to enact sketchy algorithms. We’re always gonna give you 100% reach.” Most people care about reaching their audience, and this is sort of the biggest crime of the big networks, the soft censorship that they’re doing in the newsfeed, and this actually causes mass depression. There have been studies on this that show that likes impact people’s psychology. And so, when Facebook diminishes your reach and your likes, they know it’s causing depression. And what we’re saying is we’re never gonna get in between you and your audience. And I think that over time, as we develop our UX and make it fully competitive, that’s a really compelling value proposition. I think people are pissed that the big networks are taking away their reach.
22:47 Paul Matzko: Now, when you talk about paying creators, both obviously with tokens, but then also potentially with fiat, especially since with an Ethereum transfer you can potentially cash that out. So you’re not making money via ad revenue like Facebook or many of the other social media platforms, and yet money’s going out, in theory, to creators. Where does your money come from? WordPress has a subscription kind of model for premium users. How are you making money to keep Minds up and running?
23:27 Bill Ottman: Sure. So we actually do have an ad network now. So we built the system that I mentioned before, where you earn the tokens and then you can boost your posts for the tokens. It used to be one point is one view; now, one token is equal to 1,000 impressions. So we have an ad network, but the difference is that our ad network doesn’t spy on people in order to target the ad. So to us, ads are a neutral technology. I mean, it’s a promotion tool. So you can have ethical advertising systems. Unfortunately the internet is full of spyware, which is these just gross ads that follow you around. You don’t know why, it’s sketchy. You see things that you just talked about, or just walked by. We don’t do that, and that causes it to be harder, that’s an example of it being harder for us to make money through that revenue stream. But that’s a peripheral revenue stream that we’ll share with people.
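The stated exchange rate (one token buys 1,000 boosted impressions) is simple enough to write down. A minimal sketch, with the rounding rule (partial blocks cost a whole token) being our assumption, not something stated in the interview:

```python
import math

IMPRESSIONS_PER_TOKEN = 1000  # stated rate: 1 token = 1,000 impressions

def impressions_for(tokens):
    """Impressions purchased when a user boosts a post with `tokens`."""
    if tokens < 0:
        raise ValueError("token amount must be non-negative")
    return tokens * IMPRESSIONS_PER_TOKEN

def tokens_needed(impressions):
    """Tokens required to reach a target impression count.
    Assumes partial blocks still cost a whole token (round up)."""
    return math.ceil(impressions / IMPRESSIONS_PER_TOKEN)
```

For example, boosting with 3 earned tokens would buy 3,000 impressions, and reaching 1,500 impressions would cost 2 whole tokens under the round-up assumption.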
24:36 Bill Ottman: But then we do have a premium subscription, so we have Minds Plus and Minds Pro. Minds Pro is for full‐on professional creators, where you can actually get your own custom domain, more customization tools, more storage, more monetization options. And Minds Plus is sort of like, you don’t have to see any ads or boosts, and you do get some upgrades as well. So what we’re doing… Facebook and Twitter, they don’t offer subscription products. Everyone’s like, “Oh well, how do you make money?” We charge people money and then we share that revenue with the creator. So basically we take our revenue from Plus and Pro and take a percentage of that and share it with the creators who are helping us drive traffic.
25:31 Aaron Ross Powell: An issue that comes up in this, and it’s part of the question of new social networks, and particularly ones that are, I guess, censorship proof, is early adopters. And I think this ties into growing the overall audience too, because… And I’ve noticed this, I noticed this on Minds when I first checked it out years ago, and it’s been an issue on other ones as well: the people who are the first to jump ship from existing social networks, and particularly the people who are attracted to social networks where they know they can’t be censored, or at least are unlikely to be censored, are the kinds of people a lot of us don’t want to associate with. And then you end up in this issue where, if I’m a new user and I come to this new social network and I see all of the biggest people on it are alt‐right folks, or conspiracy theorists, or Nazis, or just other people who got banned from Facebook and Twitter and elsewhere, I’m gonna say, “Well, maybe this isn’t the place for me.” But at the same time, if you’re committed to a free speech ethos, you don’t really have a way to limit those kinds of people. So how does Minds, and how does this kind of decentralized or censorship‐proof social networking, deal with that, call it the deplorables first‐mover problem?
27:07 Bill Ottman: Yeah, that’s a great question. We’re a weird mix. We do have some of that, but we have more balance than you would think once you actually dig around into the content, and you can find all of these incredible hyper‐realist artists. Among our biggest users, you’ll see people like Tim Pool, who’s a sort of centrist journalist, or you’ll see some big Libertarian YouTubers, some progressives, people like Lee Camp, Abby Martin, Joe Rogan. But you definitely do have that… I mean, look, if YouTube and Facebook are going to ban people, guess what, where are those people gonna go? They’re gonna go to the networks that will allow them to exist. And so this has been a challenge for us, because a lot of times those people are really good at being loud and getting attention and finding their way into feeds, and our hands are tied to a certain degree. And so we are trying to clarify our messaging for when people see something that they might disagree with. I think expectations go a really long way, and if you can communicate to someone why they’re seeing what they’re seeing, so that they don’t come to the conclusion, “Oh, Minds supports this content,” it completely changes the interaction.
29:03 Bill Ottman: So we just brought this guy onto our advisory board, a hero basically: Daryl Davis, who is a deradicalization expert. He’s a black man who deradicalized over 200 members of the KKK, he’s done TED Talks and whatnot, and he literally befriends extremists, and that’s how they change. I mean, people cannot deradicalize unless they engage with people who think differently from them. So this whole theory from big tech that censorship is diminishing hate speech is false. There’s no data to actually back it up. In fact, all the data shows that censorship increases extremism. So we’re working on some messaging where people see, “Oh, Minds is a neutral platform,” linking to some of this research, so you know why you might be seeing this content. This type of heads up to people, it’s a tough thing, because if you’re gonna give people recommendations, we don’t wanna get in the way, but we also don’t want people seeing what they don’t wanna see. So the challenge is making people see what they wanna see, without censorship, and that’s partially a technical challenge. It’s also something that we’re never gonna fully win. There’s always a chance that something’s gonna leak through.
30:37 Bill Ottman: And so I think that initial perception of, “Okay, I’m signing up here, I might see something that I disagree with, but guess what? The network is doing this for a reason, and actually my seeing this is in a way helping deradicalize the internet as a whole.” Because Minds is acting as sort of a pressure release valve, and other networks are sort of increasing the pressure on the Internet, which can cause people to become violent. Actually, some of the research shows that the censorship directly causes violence, because people think that they’re some sort of a victim, it totally inflames them, they go off on some kind of rampage. Now granted, it works on both sides. Extremism can metastasize in echo chambers as well, so it works both ways, but overall we see the evidence showing that open networks are more healthy, and I think that you have the ability to block what you don’t wanna see. And we do have a pretty robust NSFW filtering system.
32:00 Bill Ottman: So if we see things that are overt, if there’s a racist post or whatnot, that’s going under an NSFW filter. And some people wanna see that kind of stuff, some people don’t. We’re not gonna delete it from the network, but we’re also gonna do our best: for a new user, we’re not gonna show NSFW content by default. So that’s one of the ways that we battle it, but ultimately I think we’re gonna be able to drown it out as we grow and just get more big influencers on and whatnot. And overall, it is actually a small percentage of the content on the web, but like you said, because we’re positioned in this way, from our early stages, it does sort of pose some challenges.
32:56 Paul Matzko: So I was gonna ask… So this is a non‐unique problem for Minds. All social media networks have had to confront something similar. I think of Facebook early on: it was a kind of social network for pervy college bros to rate the attractiveness of girls. So their early adopters, if you will, were distasteful, skeevy dudes, you know? But they pivoted to build a broader audience, and they also had greater centralization, so the pivot was easier because of that. The point being, it’s non‐unique. I do wonder, though, all social media networks have had to evolve towards greater content moderation, or more formalized content moderation systems, I should say. Facebook has set up something kind of like a court system, except it’s all bench trials, if you will make the corollary to our legal system, rather than a jury trial approach, which you literally have at Minds, a kind of jury of your anonymously selected peers. Maybe walk us through your thought process as you developed that content moderation approach, which I’m sure isn’t the same now as it was in 2015, but then also, what does that look like on the ground? Maybe take some sort of hypothetical example: what does that process look like?
34:31 Bill Ottman: Sure, so Facebook rolling out their “court system”, to be honest, it’s like a marketing ploy, because at the end of the day, what matters is what is the content policy that the court system is upholding? So our content policy is a First Amendment‐based policy. We’re not smarter than the Constitution, so we don’t think we’re gonna do better than that. On Facebook, if something even looks like it might be objectionable, even historical art with nude imagery, which has immense value to the general public, gets banned. So they’re just off on some ridiculous path of who knows why they think that that makes sense. But we want to eventually move our jury system into the initial decision. So right now, we do have a moderation team who goes through a queue of all the reports and makes an initial decision. The vast majority of the time, no one has a problem with it, and in the instances that a user feels like a wrong decision has been made about their content, they appeal it. An example could be anything: it could be something that is an insult, it could be a nude image, it could be a model, it could be anything. The internet is a crazy place. [chuckle]
36:12 Paul Matzko: Yeah. And then that…
36:13 Bill Ottman: There is… Yeah.
36:14 Paul Matzko: That alienates someone… I mean someone is upset with that and reports it, I take it?
36:20 Bill Ottman: Yeah, yeah, we have a reporting system, and then that comes through a queue. And basically, if you post something… So we have this NSFW system. When you make an upload, say you’re uploading a nude model. Say you’re a photographer for Playboy. We’re okay with that, but when you upload it, you should tag it and say that this is a nude image, so that that image isn’t getting into somebody’s feed who doesn’t wanna see that. If you don’t tag your posts appropriately, then we will give strikes. Now, even if you get three strikes for not tagging NSFW content, we’re not gonna ban you, but we’re gonna flag your whole channel as NSFW, because you’re just not respecting the tagging system that we specifically built for you. Guess what, we understand that you wanna put your image in front of people, but putting it in front of people who don’t wanna see it is not cool. There’s a whole world of people who… To be honest, I even have my NSFW on, I don’t really care. But most people, when they sign up, they don’t wanna just see that stuff without opting in, so that’s how that works. And then, yeah, we have a strike system and we’re trying to evolve it.
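The strike flow described here is concrete enough to sketch. This is our illustrative reading, not Minds’ actual implementation; the class and method names are hypothetical, while the three-strike threshold and the flag-not-ban outcome come from the interview:

```python
# Sketch of the NSFW strike flow: untagged NSFW uploads draw strikes;
# at three strikes the whole channel is flagged NSFW, but never banned.

STRIKE_LIMIT = 3  # threshold stated in the interview

class Channel:
    def __init__(self, name):
        self.name = name
        self.strikes = 0
        self.nsfw_flagged = False
        self.banned = False  # this flow never bans, only flags

    def report_untagged_nsfw(self):
        """Record one strike for an untagged NSFW upload; flag the
        channel (don't ban it) once the strike limit is reached."""
        self.strikes += 1
        if self.strikes >= STRIKE_LIMIT:
            self.nsfw_flagged = True
```

A flagged channel would then be hidden from users who have not opted in to NSFW content, which matches the default-off filtering described earlier.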
37:44 Bill Ottman: One thing that I should bring up, in terms of misinformation, extreme content, all of that stuff: what we’re working on right now is a system that we call T3, for Trust Tree Traversal. So it’s sort of like a decentralized web of trust. In China, they have a centralized social credit system where everybody, based on their actions, is getting a score. What we’re doing is the opposite of that, but still taking the only thing of slight value in a score. So I have a group of friends, and my interests are different from your interests, and everyone has their own network. And so, when I look at a piece of content or at a channel, I’m going to see a score on that channel, but that’s not a centralized score. Say I look at your channel and someone else looks at your channel. Your score to me is gonna be different from your score to the other person, based on our networks. So if I’m subscribed to all kinds of PhDs, and I have a good network in place, a trustworthy network, then when I come upon certain content or channels, it is going to have a certain score which is sort of related to my network, and how they interpret that content.
39:18 Bill Ottman: So, if my network has all upvoted a certain piece of content, then that content is gonna have a really high score for me. Now, this is really important, because whether it’s misinformation or something else, we have a way of scoring it in a peer-to-peer fashion. We haven’t rolled this out yet, but we’re working on it now because misinformation is a huge issue. But just ’cause something is a lie or is wrong doesn’t mean it should be censored. I think we do need a way of deciphering trustworthiness, but not in a centralized way. So I don’t know if that makes sense, but that’s a very complex problem that we’re working really hard on right now.
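The personalized scoring idea in the last two answers, the same content showing different scores to different viewers depending on who each viewer trusts, can be illustrated with a toy function. The interview doesn’t specify the actual T3 algorithm; this weighting scheme is purely an assumption for illustration:

```python
# Illustrative sketch of the "Trust Tree Traversal" idea: the score a viewer
# sees on a piece of content is computed only from votes cast within that
# viewer's own network, so there is no single centralized score.

def trust_score(content_votes, viewer_network):
    """content_votes: dict mapping voter -> +1 (upvote) or -1 (downvote).
    viewer_network: set of accounts this viewer subscribes to / trusts.
    Only votes from the viewer's own network count toward the score."""
    relevant = [vote for voter, vote in content_votes.items()
                if voter in viewer_network]
    if not relevant:
        return 0.0  # no signal from this viewer's network
    return sum(relevant) / len(relevant)  # averages to a value in [-1, 1]

votes = {"alice": +1, "bob": +1, "carol": -1}
# Two viewers with different networks see different scores for the same post:
print(trust_score(votes, {"alice", "bob"}))  # 1.0 for a viewer who trusts alice and bob
print(trust_score(votes, {"carol"}))         # -1.0 for a viewer who trusts only carol
```

A fuller web-of-trust design would also weight votes by how distant a voter is in the viewer’s subscription graph, which is presumably where the “tree traversal” in the name comes in.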
40:02 Paul Matzko: I noticed there was a ticker at the bottom of your website which keeps a running tally of moderation actions taken in the last 30 days. When I checked, it was just under 6000 actions. There had been 13 appeals of those actions and four actions overturned. I’m not exactly sure what I expected, but there is a big disparity between those numbers. Is that normal? Let’s say we have a situation where someone posted a piece of content, someone else found it offensive and reported it, and action was taken by the moderators; it then goes to appeal. What does the appeal look like? How are the jury members selected? And is that a pretty normal ratio in terms of outcomes?
40:54 Bill Ottman: Yeah, if you look at that, what it says is that only 13 out of 6000 actions were even appealed, meaning that our moderation team is doing a pretty good job, and then of those appeals, only four out of 6000 actions were judged to be mistakes on our end, according to the jury. How it works is we take 12 randomized active users who are not connected directly to the person at hand. That increases the objectivity a little bit, just to make sure it’s not one of their friends who’s voting on them. And then they have a prompt to either accept the appeal or reject the appeal. And we make it clear: “Listen, you’re not voting on your personal opinion of whether you think this is good or bad. You are voting in line with our terms, which are in line with the First Amendment, but which also include the terms for NSFW content. So if it’s a picture of a nude person and we decided to mark it NSFW, then guess what? You really should be voting to keep it marked NSFW. You shouldn’t be voting to overturn that.”
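The jury selection just described, 12 random active users with no direct connection to the appellant, is simple enough to sketch. Function and parameter names are hypothetical, not Minds’ actual implementation:

```python
import random

# Minimal sketch of the appeal-jury selection described above: 12 active
# users chosen at random, excluding the appellant and anyone directly
# connected to them, so friends can't vote on their own.

JURY_SIZE = 12  # "12 randomized active users", per the interview

def select_jury(active_users, appellant, connections, rng=random):
    """active_users: list of user ids.
    connections: set of ids directly connected to the appellant
    (e.g. subscriptions or friends)."""
    eligible = [u for u in active_users
                if u != appellant and u not in connections]
    if len(eligible) < JURY_SIZE:
        raise ValueError("not enough unconnected active users for a jury")
    # sample() draws without replacement, so no juror appears twice
    return rng.sample(eligible, JURY_SIZE)
```

The exclusion filter is the interesting design choice here: randomness alone doesn’t prevent bias if the appellant’s own network dominates the juror pool.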
42:30 Paul Matzko: So there’s been a lot of chatter in the tech policy space for the last couple of years about tweaking Section 230 of the Communications Decency Act, which shields platforms and internet companies from civil liability. They’re still obviously on the hook for criminal liability; if they’re violating child pornography laws or sex trafficking laws, it doesn’t shield them from that. But there has been a wave of challenges to that kind of liability protection for platforms like Minds, like Facebook, like the others. There have been holes punctured in it, [43:22] ____. How concerned are you about challenges to Section 230, and what would the implications be for Minds specifically if we saw Section 230 liability protection completely overturned?
43:41 Bill Ottman: I have a hard time believing it would be completely overturned. The issue with Google and Facebook and all of the big networks is that, through their proprietary algorithms, through their favoring of certain types of content, they are basically acting as publishers. They’re going to favor certain content, whether it’s a certain political ideology or whatever; we don’t know what their algorithms are actually doing, but they clearly are favoring certain types of content. And I think it’s a reasonable argument that they shouldn’t necessarily receive all the protections. But for networks who are operating in a neutral fashion, I don’t think that immunity is likely to be struck across the board.
44:56 Paul Matzko: On the topic of censorship, and you’ve talked about this earlier, in particular with the COVID-19 disinformation problem on social media: different platforms have attempted to address COVID-19 truthism [45:10] ____ in different ways. Some put a disclaimer next to flagged posts; others have pulled down accounts entirely. How is Minds attempting to address disinformation in a time of viral pandemic? And do you see issues of reporting bias or media bias in how platforms are responding on their own sites?
45:36 Bill Ottman: Yeah. Again, they’re acting as publishers. They’re taking a clear stance on certain types of content. Even just covering COVID on YouTube is causing full-on demonetization, if not deletion. It’s incredible that, just across the board, they would disincentivize coverage of that. Now, I do think disclaimers are good. Enabling the user to become educated about the full scope of the discussion around COVID, I think that is the most important thing.
46:35 Bill Ottman: So let’s look at all of the papers, all of the different opinions, and put people in a position to educate themselves. We’re in no position to be telling people what is the reality of COVID-19. We’re developers and we’re running a social network. Why would we be telling people what is true or not with regard to that? And I don’t think that Google and Facebook should be doing that either. I think they should be providing people with the scope of differing opinions on it, and a lot of people have their own interpretation of what is the truth. It’s not just about the pandemic, it’s about any issue. There’s immense debate from legitimate thinkers on both sides of so many different topics, and this is when politics starts bleeding into it, and identity politics, and it’s pretty clear actually what Facebook and Google think about certain topics. I think that’s really scary. So, again, teaching people how to research is more our prerogative than deleting certain misinformation. I think that misinformation is dangerous, but probably more dangerous are the unintended consequences of having algorithms powered by proprietary AI just blanket-banning whole libraries of content. That is truly dystopian.
48:27 Paul Matzko: Now, you’re also the CIO of Subverse. Explain what Subverse is; the decentralized future of journalism should have a role, I think, in this conversation. As you wear those two hats at Minds and at Subverse, how do you see those two platforms interacting or working towards a common cause?
48:51 Bill Ottman: Sure. So yeah, I’m not really involved with editorial at Subverse; I’m more involved on the board and on the tech end of things. But I think what Subverse is doing, in terms of building a distributed team of journalists all over the world, on the ground, able to get footage from all different kinds of global events, is super powerful, and it’s trying to kind of bring objectivity back into journalism. I don’t know if you’ve watched any of Subverse’s content, but it really is straight down the middle, almost to the point of being too down the middle, which some people don’t like because it can come off as not having that kind of flair that everyone is thirsty for with polarized, sensational media. But we believe that there’s a huge need for just footage, even just footage, so that you’re making the decision for yourself when you’re watching the media. Just balanced coverage.
50:16 Bill Ottman: And so we’re building a network of journalists all over the world, able to submit content. And again, people make their own decisions based on the content. I think that’s what everyone wants. You can still have personality with balanced coverage, but it’s just really sad, the state of the media, and how with nearly every news organization you can say right off the bat what side of the aisle they’re on. It’s pretty unbelievable.
50:57 Paul Matzko: I’ve been to a couple of conferences where there’s a common lament among journalists, and not a misplaced one, that the old financial model that supported so much investigative journalism, the stuff we think of when we think of journalism as the fourth estate, the higher-order, less profitable forms of journalism, was subsidized by classified ads and advertising. Print newspapers back in the day were cross-subsidized, but now the Internet, Craigslist, new media have undermined that old financial model. We’ve seen traditional journalism as a self-sustaining concern implode; the number of journalists employed has fallen dramatically. How does Subverse and this decentralized model of journalism solve that funding problem? Journalists need to be employed, they need to be paid in order to do the work they do. Can something like Subverse replace what we’ve been losing on the journalism scene? And where does Subverse fit in the future of decentralized journalism?
52:20 Bill Ottman: Yeah, so one key component of both Minds and Subverse is that both are community-owned. Both companies did equity crowdfunding rounds, which means that thousands of users of Minds and thousands of supporters of Subverse actually own a piece. For Minds, those investments have now been converted into actual preferred stock, so Minds has 1500 community members who own stock in the company. We used a platform called Wefunder to enable this, and this was through the JOBS Act, which created a new type of funding called regulation crowdfunding, where both accredited and non-accredited investors can invest. It used to be that startups could only take money from accredited investors; you couldn’t just go to the general public.
52:36 Bill Ottman: And Subverse has now had around 3000 supporters invest, and those investments are under what’s called a Simple Agreement for Future Equity, so upon the next funding round those agreements will convert into stock. That’s a huge differentiator in terms of funding models: it’s not some big singular VC, it is the community. And then there’s that on top of more of a subscription model, which, to be honest, some mainstream media are starting to do pretty well, and that is far more sustainable than what has been happening over the last decade with literally the most disgusting spyware-filled news pages. You get in there and there’s a million little ad blocks all over the place. And the irony is that half the time it’ll be some investigation of surveillance, or maybe a piece about Edward Snowden, and the page is just all over you. I’m almost positive I experienced that on Vice a couple of times, where the page wasn’t HTTPS, it was full of spyware and third-party commenting tools, and it was some piece about… You know, ’cause they have great coverage of Snowden.
55:00 Paul Matzko: Yeah. [chuckle]
55:00 Bill Ottman: But then the site itself is doing the very thing that Snowden is railing against. And people are doing this because they have to survive; it’s not that they even want to do it. That’s why we really need to push the envelope with these direct-to-supporter models of funding. So yeah, that’s what we’re going for. We’re not gonna be reliant on that kind of revenue.
55:30 Paul Matzko: Last question, so you mentioned that there were some initiatives, some new features you were planning on rolling out at Minds, so for the upcoming year, what’s one of those features that you’re most excited about?
55:44 Bill Ottman: One thing that I recommend for anyone who’s listening is going to Minds.com/canary, C-A-N-A-R-Y. That’s our experimental mode, so you can have the latest version of the app. We are probably most excited for the decentralization features. I’m really excited to be able to post content more directly to distributed systems like Arweave, and we’re talking to some other potential partners, so that content is fully permanent. I’m very excited for that. And then, definitely the monetization features and this radical revenue sharing that we’re doing. We’re basically gonna be allowing creators to submit content to our premium feed and earn a revenue share on that. Imagine Netflix, but anybody could submit to it and Netflix would pay out the community. That’s one thing that we’re working really hard on.
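The premium-feed revenue share sketched with the Netflix analogy could work many ways; the interview doesn’t say how the pool is actually divided, so the pro-rata-by-engagement rule below is purely an assumption for illustration:

```python
# Toy illustration of a creator revenue share: a subscription revenue pool
# is split among contributing creators pro-rata by engagement. The split
# rule and all names here are hypothetical, not Minds' actual scheme.

def split_revenue(pool, engagement):
    """pool: total revenue to share out.
    engagement: dict mapping creator -> engagement count (e.g. views)."""
    total = sum(engagement.values())
    if total == 0:
        return {creator: 0.0 for creator in engagement}
    return {creator: pool * views / total
            for creator, views in engagement.items()}

shares = split_revenue(1000.0, {"musician": 600, "journalist": 400})
# With a 1000.0 pool: musician gets 600.0, journalist gets 400.0
```

The point of the analogy is the direction of the payout: subscribers fund a pool that flows to the community of creators rather than to a single catalog owner.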
56:51 Bill Ottman: And then, oh gosh, there’s so much that’s getting upgraded. We’re doing end-to-end encryption: our messenger’s currently encrypted, but it’s not fully end-to-end, so we’re rolling that out, and the messenger is gonna be undergoing a total overhaul. But yeah, the monetization. I think that people right now are all looking to be independently sustainable financially. I know for me personally, being an independent entrepreneur changed my life: being able to work from home, work on my company. There are so many people for whom that’s so far from their reality, and for us to be able to provide tools so that people, whether they’re a musician, or a journalist, or whatever they are, can build an audience, gain a following, and earn money for doing what they’re doing anyway on social media, that is huge for us. We actually just did our first payout to our pro users in March, and that was huge, to finally feel like we’re starting to support creators. So we have these different revenue streams that we’re rolling out. Between that and just hurtling down the path of decentralization and anti-censorship, I would say those two things.
58:27 Paul Matzko: Well it sounds like you got a busy year ahead of you. Thanks so much for taking the time to talk with us.
58:31 Bill Ottman: Thanks guys, really enjoyed it.
58:35 Paul Matzko: Who knows whether Minds will be able to knock off the dominant social media platforms, but even if it never grows beyond its current audience numbers, it’s a success simply as a proof of concept. A more decentralized web 3.0 is possible. Monetization other than through targeted ads is possible. A democratic content moderation system is possible. Whatever platform eventually, and perhaps inevitably, displaces the likes of Facebook, Instagram, and Twitter will be able to learn from the example of Minds. Alternatively, the current incumbents might borrow some of these ideas for themselves. Either way, there is no great stagnation in social media. Until next time, be well.
59:22 Paul Matzko: This episode of Building Tomorrow was produced by Landry Ayers for Libertarianism.org. If you’d like to learn more about libertarianism, check out our online encyclopedia or subscribe to one of our half dozen podcasts.