Note: This article originally appeared on the J.C. Neu & Associates Blog
This is part 2 of highlights from the FTC’s December 7th Privacy Roundtable. Part 1 covered the panel on "Exploring Existing Regulatory Frameworks." This post highlights comments from "Benefits and Risks of Collecting, Using, and Retaining Consumer Data." This session was moderated by Jeffrey Rosen of The George Washington University Law School and Chris Olsen of the FTC’s Division of Privacy and Identity Protection.
- Leslie Harris, President and CEO of the Center for Democracy & Technology emphasized that information taken out of context can be used to unfairly judge a person, such as a search for "marijuana" or a medical condition. The danger increases when a rich profile of search terms and surfing data is constructed over time.
- Susan Grant, Director of Consumer Protection for the Consumer Federation of America, said that "privacy is a fundamental human right."
- Alessandro Acquisti of Carnegie Mellon University, Heinz College, explained that the definition of "sensitive information" continues to change with technology, new uses for information, and new ways to correlate and aggregate personal information. Technology cannot stop re-identification or de-anonymization, but should be used to increase the transaction costs for re-identifying personal information. He also spoke about how companies are bypassing consumer efforts to maintain privacy and anonymity through technologies such as Flash cookies.
- Richard Purcell, CEO of the Corporate Privacy Group, emphasized that citizens’ health depends on anonymized health data used for research, and that privacy must be weighed using a cost-benefit analysis.
- Michael Hintze, Associate General Counsel for Microsoft, explained that companies log and use information for a number of legitimate reasons, such as security analysis and search result optimization. However, he admitted that search terms can reveal individuals’ "innermost thoughts," and that anonymization is not a silver bullet for protecting users. Instead, retention and deletion policies, such as Microsoft’s policy of deleting IP addresses and cross-cookie session information, are designed to truly anonymize search data.
- David Hoffman, Director of Security Policy and Global Privacy Officer for Intel, stressed that we should focus on data minimization. "We have wasted time arguing about what constitutes PII," when the question should be, "what information will have an impact on an individual?"
- Jim Harper, Director of Information Policy Studies for The Cato Institute, argued that regulating too early can stifle innovation and prevent consumers from determining what they want themselves. Instead, we should attempt to define the problem set first. Mr. Harper explained that "there is a role for trial and error in determining what the real problems are," and that intellectualizing what consumers really want can lead to problems. Instead, we should "let a thousand flowers bloom" and let the social systems and advocacy draw out and solve the real issues.
- David Hoffman generally agreed that we don’t want to frustrate innovation, and that we are not currently in a position to understand all of the problems ourselves. He explained that it took a room full of experts the better part of a day to map out data flows. "We can’t expect consumers to understand how the data flows if experts can’t understand it now."
- Leslie Harris and Alessandro Acquisti said that notification and transparency are necessary but not sufficient. Mr. Acquisti noted that consumers make decisions that harm their long-term privacy because humans are bad at making decisions when the benefits are short-term but the harm is long-term. He compared privacy-erosive behavior to smoking: each smoker realizes that smoking causes cancer, but any individual cigarette doesn’t hurt much.
- Susan Grant explained that consumers don’t realize that their information can be used for other purposes, and that the benefits of marketing do not outweigh privacy concerns or the risks of fraud and abuse. Jim Harper countered that advertisers can introduce a new medication to vulnerable populations, and that denying them that opportunity can create silent harms. Michael Hintze added that niche ads aren’t good or bad; they’re responsible or irresponsible.
- Richard Purcell also argued that companies should spend the time and money to train their customers, and create "privacy by design" rather than "privacy by default." Finally, he said the FTC should "regulate the hell out of" lazy companies and bad actors.
- Richard Purcell further emphasized that we lack a cohesive taxonomy for discussing privacy, and that we need to better define concepts such as "anonymity," "deidentification," and "sensitive data."
- The panel was asked to consider widespread customer blacklisting. Susan Grant said that consumers need tools to discover and amend secret "bad customer" lists, since they have none now. Distinctions based upon invisible information are bad for consumers. Leslie Harris agreed, saying that we need a law that provides access and correction for data brokers as well. She also criticized the FTC for failing to investigate privacy violations, noting that all of our bad examples are "accidental" breaches, not intentional, long-term decisions to violate privacy uncovered by the FTC.
- In the larger context, Jim Harper said that government access to personal information is the elephant in the room that nobody has yet addressed. Governments are beginning to discover "the cloud" for their own purposes, and when data is available to the government on the current terms, it constitutes surveillance on a massive scale.
I’ll do a few more installments in the coming days.