
Privacy’s Perilous Path

Legal use does not always equate to ethical use.
By Jody Westby | Posted on March 1, 2019

Facebook got everyone’s attention in March 2018 when The New York Times and The Guardian revealed that Cambridge Analytica had used the personal data of more than 50 million Facebook users to help the Trump campaign.

A former Cambridge Analytica employee turned whistleblower revealed that Facebook had never audited the application developers it allowed to access its data to confirm they were using the data according to its terms. Facebook subsequently announced it would conduct a thorough review of all application developers’ use of its data.

The drumbeat on privacy in the United States grew louder with congressional hearings that probed Facebook on its data-sharing practices. The controversy revealed how 126 million Facebook users might have been played by Russians in an attempt to influence the 2016 presidential election. A few months later, the Times reported that Facebook had allowed numerous device manufacturers, including Amazon, Apple and Samsung, access to user data without Facebook users’ explicit consent, an apparent violation of a Federal Trade Commission consent decree. Then, late last year, the Times obtained documents indicating that Facebook had entered into data-sharing agreements with at least 150 companies, including Amazon and Microsoft.


All the attention fueled investigations into how much of Facebook’s data, and other social media data, is shared with third parties. It also raised questions about what Facebook knew, and when, about Russia’s manipulation of its platform and users. The Times reported in late November that Facebook’s senior leaders had deliberately tried to keep what the company knew about Russia’s tactics under wraps. The company’s directors pushed back on that report, claiming they had pressed CEO Mark Zuckerberg and COO Sheryl Sandberg to speed up the Russia investigation and calling allegations that the two executives ignored or hindered investigations “grossly unfair.”

By mid-2018, online users (that is, all of us) were finally beginning to understand the power of big data. Yet they also realized they had no real idea how every digital fingerprint they leave in texts, emails, Facebook posts, tweets, Google searches and the like was being shared with others. A Pew Research Center report in September indicated that more than half of Facebook users had changed their privacy settings, 40% had taken a break from Facebook, and 25% had deleted the Facebook app from their phone.

Important lesson: privacy expectations can be more powerful than laws, because their hammer is market forces, not fines or penalties. After the Cambridge Analytica scandal, Facebook was forced to report lower-than-expected earnings. Within hours, it lost $130 billion in market value.

Meanwhile, on May 25, 2018, the European Union’s General Data Protection Regulation took effect, forcing companies to focus on what data they have, where they get it and who accesses it. Shortly thereafter, California enacted the California Consumer Privacy Act of 2018, which takes effect Jan. 1, 2020. The law is similar to the European Union’s data protection regulation, but there are key differences. For example, the California law does not require consent to process personal information and does not include the right to be forgotten or to have data corrected, two important features of the EU regulation. Nevertheless, California’s law is as close as any U.S. law has come to emulating EU privacy requirements, a development that thrilled privacy advocates and scared companies.

Ethics of Data Sharing

Another topic that emerged last year was the ethics of data sharing. Wired ran a story last July headlined “Was It Ethical for Dropbox to Share Customer Data with Scientists?” In a Harvard Business Review article, Northwestern University researchers revealed they had obtained data from Dropbox and analyzed the data-sharing and collaboration activities of tens of thousands of scientists from over 1,000 universities. Dropbox justified its sharing of this data by relying on its privacy policy and terms of use. The ensuing uproar caused Dropbox and the researchers to clarify that the data had been anonymized and aggregated before the researchers obtained it. Others, however, pointed out how folder structures and file names could still be used to identify individuals. Dropbox was in the hot seat.

The Cybersecurity Division of the Homeland Security Advanced Research Projects Agency funded a multi-year project examining the ethics associated with the use of communications traffic data by cyber-security researchers. The resulting report, known as The Menlo Report, published in 2012, was an early attempt to establish parameters for the ethical use of personal data in cyber-security research projects.


The ethics of data sharing are not always clear-cut. When a researcher finds a trove of data in a cyber criminal’s online cache, the temptation to use it is probably no less compelling than it was for Uber when Unroll.me offered it Lyft customer receipts in 2017. A privacy policy or terms-of-use statement might give you legal cover for data sharing, but the users whose data you share, or buy, might question your ethics.

Accenture has studied the ethics of digital data and developed 12 “universal principles.” These include:

  • Maintain respect for the people who are behind the data.
  • Create metadata to enable tracking of context of collection, consent, data integrity, etc.
  • Attempt to match privacy expectations with privacy controls.
  • Do not collect data simply to have more data.
  • Listen to concerned stakeholders and minimize impacts.
  • Practice transparency, configurability and accountability.
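
Of these, the metadata principle is the most operational. As a purely illustrative sketch (the record structure, field names and example values below are assumptions made for this article, not an Accenture specification or any company’s actual schema), a provenance record that travels with a dataset, such as the anonymized collaboration data in the Dropbox example above, might look something like this in Python:

```python
# Hypothetical illustration of the "create metadata" principle: a minimal
# provenance record attached to each dataset an organization buys, shares or stores.
# All field names and values are illustrative assumptions, not a real schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataProvenanceRecord:
    dataset_id: str                 # internal identifier for the dataset
    source: str                     # where the data came from (first party, broker, partner)
    collected_at: datetime          # when the data was collected
    collection_context: str         # how and why it was collected
    consent_basis: str              # legal basis claimed (e.g., "terms of service", "opt-in")
    consent_scope: list = field(default_factory=list)   # uses the data subject agreed to
    shared_with: list = field(default_factory=list)     # third parties that received the data

record = DataProvenanceRecord(
    dataset_id="collab-activity-2017Q3",
    source="first-party product telemetry",
    collected_at=datetime(2017, 7, 1, tzinfo=timezone.utc),
    collection_context="folder-sharing activity logged during normal product use",
    consent_basis="terms of service",
    consent_scope=["service operation"],
    shared_with=["external academic researchers"],
)

# A simple check the principles imply: flag sharing that exceeds the consented
# scope, even where the terms of service arguably permit it.
if record.shared_with and "research" not in " ".join(record.consent_scope):
    print(f"Review needed: {record.dataset_id} shared beyond consented scope")
```

The point is not the particular fields but the discipline they represent: when collection context, consent basis and downstream sharing travel with the data, a mismatch between what users agreed to and what the company intends to do can be flagged before the data goes out the door.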

Companies face real risks, and perhaps internal disagreement, when trying to balance their customers’ privacy expectations against the drive to maximize profits. Remember that Sheryl Sandberg was reported to favor keeping the discoveries of Russian interference and the exploitation of user data quiet, while Facebook’s chief information security officer at the time favored more public disclosure. Two University of Colorado researchers studied the public reactions to the sale of Lyft customer receipts to Uber and to WhatsApp’s 2016 announcement that it would share data with Facebook to improve Facebook ads and the user experience. Their conclusion is noteworthy:

“Our findings also point to the importance of understanding user expectations when it comes to privacy; whether most users agree that it’s okay to be the product or not, shaping expectations with more transparency could help reduce the frequency of these kinds of privacy controversies.”

But relying on privacy policies or terms of service can be a perilous path. Users’ expectations of privacy will often prevail over legalese, and no one can keep a straight face and claim to believe their users actually read the privacy policy or terms of service. The events of 2018 struck a note of outrage in online users, and legislators, regulators and plaintiffs’ attorneys are paying close attention.

In 2019, organizations would be wise to analyze the data they buy, share, use and store, to examine their legal basis to do so, and to consider that their customers might have contrary privacy expectations. Legal use may still violate a person’s expectation of privacy and thus be viewed as an unethical use. Agents and brokers should encourage their clients to be forward thinking on this issue and proactively manage potential privacy risks associated with their data or the data they may obtain from third parties.
