Building Tomorrow explores the ways technology, innovation, and entrepreneurship are creating a freer, wealthier, and more peaceful world.

Oct 26, 2018

Protecting Data Privacy Without Destroying the Internet

Protecting internet data privacy without hindering innovation requires a dose of legislative humility and a strong trust in consumer intelligence.

Data Privacy Internet

Recent data breaches at Google and Facebook have commentators glibly comparing the regulation of the internet in the U.S. to the “wild west,” as if there were no laws on data privacy and data protection.1 Some are calling for the adoption of heavy-handed, European-style controls such as the General Data Protection Regulation (GDPR), which imposes 45 specific rules on data-driven enterprises. They have applauded California’s new data regulation, which grants sweeping powers to the state’s attorney general to collect fees, impose rules, approve business plans, and solicit public support for class actions against internet companies. It is reasonable to be skeptical of the notion that increasing government power is the key to protecting privacy, but without federal preemption, the nation could balkanize into 50 sets of online privacy rules, undermining the seamless digital experience consumers enjoy today as well as the internet economy, which powers some 10 percent of national gross domestic product.

The regulatory approach to internet data privacy and protection is wrong

One reason people believe the U.S. has an inferior, laissez-faire approach to internet regulation is that they confuse data privacy with data protection and are unfamiliar with America’s own substantive privacy protections developed since the founding. In fact, there are literally hundreds of laws on privacy and data protection in the U.S.—including common law torts, criminal laws, evidentiary privileges, federal statutes, and state laws.2 America’s tradition of protecting privacy is predicated on ensuring the individual’s freedom from government intrusion and pushing back the overreach of the administrative state. By comparison, the EU’s laws are relatively new, officially dating from this century, and still lack the runway of judicial scrutiny and case law that characterizes U.S. law.

Europe’s experience gives us a glimpse of what to expect should we adopt a similarly heavy-handed regulatory approach in the U.S. Simply put, the EU’s laws do not create trust in the online ecosystem. After a decade of data protection regulation—in which Europeans have endured intrusive pop-ups and disclosures on every digital property they visit—Europeans report no greater sense of trust online.3 As of 2017, only 22 percent of Europeans shop outside their own country (a paltry increase of 10 percent in a decade). Moreover, only 20 percent of EU companies are highly digitized.4 Small and medium-sized European companies have neither modernized their operations nor marketed to other EU countries5 because data protection compliance costs are too high.

To do business in the EU and comply with the new rules, U.S. firms with 500 or more employees will likely have to spend between $1 million and $10 million each to comply with the GDPR.6 With over 19,000 such firms in the U.S., total GDPR compliance costs for U.S. firms alone could reach $150 billion,7 twice what the U.S. spends on network investment8 and one-third of annual e-commerce revenue in the U.S.9 Unsurprisingly, thousands of online entities, both in the EU and abroad, have proactively shuttered their European operations for fear of getting caught in the regulatory crosshairs.

Moreover, there is a business model behind data protection regulation. Not only will Europe have to hire some 75,000 new data protection professionals as regulatory compliance officers,10 but regulatory authorities are also doubling their staffs and budgets to take on the increased workload of managing compliance and complaints. Just seven hours after the GDPR came into effect in May 2018, Austrian activist Max Schrems lodged complaints against Google and Facebook, demanding $8.8 billion in damages because their services are so popular that they effectively “force” people to use them.11

The politics of data privacy and protection

A decentralized, limited-government approach has been empirically shown to better protect data privacy, but regulatory advocates are too powerful, organized, and determined to leave well enough alone. They have appointed themselves the protectors of all Americans, whom they deem unwitting digital serfs, forced to engage in transactions against their will and too stupid to learn how to be safe online. While free thinkers value sovereignty and choice, they are diffuse and difficult to galvanize. The sweeping regulations adopted in California and the European Union were enabled by a small yet vocal group of activists.

While the media emphasizes the partisan chaos in Washington, there is a bona fide, fact-based, bipartisan effort within Congress to create a rational policy for consumer online privacy. The Senate Commerce Committee has hosted a series of hearings to gather input from a variety of stakeholders. In addition, the Trump Administration has tasked key agencies with developing scientific and policy principles that ensure standards and guarantee freedom of choice for individuals while also giving organizations legal clarity and the flexibility to innovate. It may seem counterintuitive that we need more privacy legislation, but in this case, the outcome will be worse for freedom if Congress does not clarify a single national policy.

How a market-based approach protects data privacy

The elements of a market-based approach include a consistent national policy that promotes technological innovation, consumer education, and freedom of choice for consumers.

Privacy-enhancing technologies. Continuous technological improvement of online systems will always be better than regulatory regimes that rely on bureaucrats to decide how data should be processed and which abuses to adjudicate. Scientific research demonstrates that privacy-enhancing innovation (a field including dozens of technologies such as encryption, data minimization, anonymization, and attribute-based access controls) makes the online experience safer and more private than a bureaucratic approach can. Moreover, soft law instruments such as multi-stakeholder processes, scientific best practices and standards, and codes of conduct can address emerging data protection challenges without resorting to heavy-handed rules. Policymakers should consider the role of incentives for design and experimentation with privacy-enhancing technologies (PETs). These can include grants, awards, and competitions. Importantly, a national policy would include legal safe harbor for innovators so that they can experiment without punishment and so that enterprises can be confident that they are complying with the law.
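As a concrete illustration of one such PET, data minimization can be as simple as filtering a record down to the fields a given purpose actually requires before anything is stored. The field names and purposes below are illustrative assumptions, not drawn from any real system:

```python
# A minimal sketch of data minimization, one privacy-enhancing technique.
# Only the fields a given purpose requires are retained; everything else
# is discarded before storage. Field names and purposes are illustrative.

REQUIRED_FIELDS = {
    "appointment_reminder": {"phone"},
    "marketing_email": {"email"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return a copy of the record containing only the fields the purpose needs."""
    allowed = REQUIRED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

customer = {"name": "Ada", "email": "ada@example.com", "phone": "555-0100"}
print(minimize(customer, "appointment_reminder"))  # {'phone': '555-0100'}
```

The point of the sketch is that the privacy property is enforced in code, at the moment of collection, rather than adjudicated by a regulator after the fact.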

Consumer education. Informed consumers who have the freedom to choose among a robust array of goods and services are the bedrock of a free-market economy. This assumes a marketplace in which there is sufficient information, ease of market entry and exit, and minimal regulatory distortion. Scientific research concludes that the consumer’s level of knowledge about the online experience is crucial when it comes to creating trust online. Notice and consent are meaningless to consumers if they don’t understand the nature of the transactions in which they engage, how online platforms work, and the associated costs, benefits, and alternatives. (See p. 13 of this filing to the Federal Trade Commission for the history of consumer education and models of online privacy education.) Individuals need to take responsibility for educating themselves about the online services they use, and policymakers must ensure that there are transparent ways for consumers to access that information. Moreover, educated consumers are a powerful check on unelected, unaccountable bureaucrats, limiting the need for regulation in the first place.

Choice. Individuals must have freedom of choice over whether to share their data in exchange for a service as well as the ability to say no to terms and conditions which make them feel uncomfortable. When a consumer says no and declines the service, this sends an important message to providers to improve their products and services. A key problem of the California and European rules is that they obligate providers to deliver services even if users object to sharing their data. This perversion creates a free rider problem, which increases the amount of processing that must be performed on consenting users so that the service provider can cover its costs. Moreover, it removes the essential feedback that providers need from users so that they can improve their services.

Flexibility. A recent Senate hearing featured the architect of the California Consumer Privacy Act, Alastair MacTaggart, who took offense when his local Supercuts hair cuttery requested his email address and phone number as he checked in for an appointment. MacTaggart called it “out of control” and intimated that this practice should be eliminated for all Supercuts customers. (He also spent nearly $3.5 million of his own fortune from a successful real estate business, which, ironically, relies on the same kind of data processing he now wants to eliminate.) This kind of elitism fails to see how many people appreciate SMS reminders for their salon appointments and want to receive email offers of coupons for hair care products, discounts, and so on.

The situation is a reminder of the need for regulatory flexibility. Consumers who do not want to participate in such programs should not have to, but those who want to should be allowed. Regulatory advocates don’t like the idea that a customer loyalty program has such requirements. They don’t want enterprises to have the flexibility to reward loyal customers. Again, this creates a free rider problem. If enterprises are obliged to make offers available without any minimum requirements, the provider’s incentive for offering the promotional program is removed, and the provider pulls the offer. This leads to overall price increases while reducing welfare for the set of customers who wanted the offer in the first place. In any case, there are technological workarounds that can secure privacy without eliminating enterprise, such as anonymizing email addresses and phone numbers. (See p. 11 of the filing for the discussion on anonymization.)
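As a sketch of one such workaround, contact details can be pseudonymized with a keyed hash before they enter a loyalty-program database, so repeat visits can be matched without the raw email address or phone number ever being stored. The secret key and normalization rules below are illustrative assumptions, not a production design:

```python
import hashlib
import hmac

# A minimal sketch of pseudonymization via keyed hashing. The key below
# is an illustrative placeholder; a real deployment would keep it in a
# secrets manager, separate from the database that holds the tokens.
SECRET_KEY = b"example-secret-key"

def pseudonymize(contact: str) -> str:
    """Map an email address or phone number to a stable, keyed token."""
    normalized = contact.strip().lower().encode("utf-8")
    return hmac.new(SECRET_KEY, normalized, hashlib.sha256).hexdigest()

# The same contact always yields the same token, so a loyalty program can
# recognize a returning customer without retaining the raw identifier.
assert pseudonymize("Customer@Example.com") == pseudonymize("customer@example.com")
```

The keyed construction matters: unlike a plain hash, an outsider without the key cannot rebuild the mapping from known emails or phone numbers to tokens.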

Consistency. America’s 50 states are a single market, which is a boon to America’s digital economy. An app posted in Maine can serve a user from Hawaii. However, California’s new privacy law disrupts this seamlessness, inhibiting commerce both inside and outside the state. Other states (NY, NJ, MD, MA, RI, IL, and CT) are threatening to make their own rules. We need a single federal privacy standard enforced by a single federal regulator—ideally the Federal Trade Commission. The FTC can enforce the standard with the cooperation of state attorneys general.

Conclusion

The cycle of privacy panic, the manufactured fear that accompanies new technologies, has been a well-documented phenomenon for more than a century. When first introduced, photography was maligned for violating one’s privacy. As people experience a new technology, they grow more comfortable with it, ultimately adopting it in a way that demonstrably improves their lives. When asked what has brought the biggest improvement to their lives in the past 50 years, Americans name technology more than any other advancement, notes Pew Research in a 2016 survey.

Today’s debate about the data-driven economy is no different. Market-based solutions can address data privacy concerns without surrendering the internet to government control. If anything, this legislative moment is about reaffirming America’s history of data protection and privacy. We need federal law to stop state level overreach so that the freedom of individuals and enterprises can flourish.


[1] See, e.g., Joe Nocera, The Wild West of Privacy, N.Y. Times, Feb. 24, 2014, https://www.nytimes.com/2014/02/25/opinion/nocera-the-wild-west-of-privacy.html.

[2] See Daniel J. Solove, A Brief History of Information Privacy Law in Proskauer on Privacy (2006).

[3] Daniel Castro and Alan McQuinn, The Economic Cost of the European Union’s Cookie Notification Policy, ITIF, Nov. 6, 2014, https://itif.org/publications/2014/11/06/economic-cost-european-unions-cookie-notification-policy.

[5] European Commission, Better Access for Consumers and Business to Online Goods, May 6, 2015, https://ec.europa.eu/digital-single-market/en/better-access-consumers-and-business-online-goods.

[6] PricewaterhouseCoopers, GDPR Compliance Top Data Protection Priority for 92% of US Organizations in 2017, According to PwC Survey, January 23, 2017, https://www.pwc.com/us/en/press-releases/2017/pwc-gdpr-compliance-press-release.html.

[7] U.S. Census Bureau, 2015 SUSB Annual Data Tables by Establishment Industry, January 2018, https://www.census.gov/data/tables/2015/econ/susb/2015-susb-annual.html.

[8] Jonathan Spalter, Broadband CapEx Investment Looking Up in 2017, USTelecom, July 25, 2018, https://www.ustelecom.org/blog/broadband-capex-investment-looking-2017.

[9] U.S. Census Bureau, Quarterly Retail E-Commerce Sales 1st Quarter 2018, May 17, 2018, https://www.census.gov/retail/mrts/www/data/pdf/ec_current.pdf.

[10] Rita Heimes and Sam Pfeifle, Study: GDPR’s Global Reach to Require at Least 75,000 DPO’s Worldwide, IAPP, https://iapp.org/news/a/study-gdprs-global-reach-to-require-at-least-75000-dpos-worldwide/.

[11] GDPR: Noyb.Eu Filed Four Complaints Over “Forced Consent” against Google, Instagram, Whatsapp and Facebook, noyb (May 2018), https://noyb.eu/wp-content/uploads/2018/05/pa_forcedconsent_en.pdf.