ACLU Urges FTC to Address the Many Commercial Surveillance Practices that Disempower and Harm Consumers
WASHINGTON — The American Civil Liberties Union submitted its comment in response to the Federal Trade Commission’s call for input from the public about “whether new rules are needed to protect people’s privacy and information in the commercial surveillance economy.”
“The ACLU supports FTC rulemaking to rein in commercial surveillance, not by burdening users with the impossible task of managing their own data as it flows through the complex web of advertisers, data brokers, government agencies, and other parties who buy and sell it for their own benefit, but by changing the paradigm and demanding that companies use consumer data in service of consumers,” the comment reads. “Strong rules that go beyond the ‘notice-and-choice’ paradigm are the only way to address the serious harms that consumers experience under the current abusive system of commercial surveillance.”
“Companies have tremendous profit motives to collect as much personal information as possible and monetize it in increasingly invasive ways, to the detriment of ordinary people,” said Nathan Freed Wessler, deputy director of the ACLU’s Speech, Privacy, and Technology Project. “Neither the patchwork of mostly outdated and weak state and federal privacy laws nor so-called ‘self-regulation’ by companies has kept up with today’s dire state of affairs. The FTC’s serious interest in how to better protect people against harm is heartening, and we urge the commission to continue to move deliberately toward enshrining regulations with real teeth.”
The FTC’s Advance Notice of Proposed Rulemaking asked almost 100 questions about practices related to commercial surveillance, data security, and algorithmic discrimination. The ACLU’s comments highlighted its support for policies that:
- Ensure that harmful data uses are prohibited;
- Require audits to identify and address algorithmic bias, discrimination based on protected class status, or other negative outcomes;
- Impose purpose limitations and data minimization requirements that prevent exploitation of consumer data for purposes unrelated to those for which the data was provided;
- Communicate clearly with users and obtain informed, opt-in consent to confirm that data practices match consumer expectations – while also making clear that consent rules cannot replace strong substantive limits on abusive practices, including data minimization and purpose limitations;
- Require companies to adopt a comprehensive framework to govern the use of automated decision-making systems;
- Protect data from misuse by implementing appropriate privacy and security practices;
- Maintain transparency so consumers can understand how their data is used;
- Surface harms that individuals are not equipped to recognize and ensure they are remedied, including harms from discrimination based on protected class status; and
- Address the potential for consumer harm from government surveillance.
The ACLU’s comments also highlighted its opposition to policies that clearly fail to protect consumers’ privacy, such as:
- Opt-out “consent,” which not only saddles consumers with the infeasible task of identifying and managing their own privacy but also incentivizes companies to “innovate” by concocting new schemes to deter consumers from exercising their opt-out rights; and
- De-identification or pseudonymization, which, given the well-documented ability to re-identify information, far too frequently provides inadequate protection against privacy harms, and which does not address the dignitary and community-oriented harms that can surface even from truly anonymous data.
The ACLU’s comments also urge the FTC to enact binding rules to identify and prevent algorithmic discrimination. The FTC should exercise its authority over unfair practices to regulate the myriad ways in which consumers are harmed by discrimination enabled by commercial surveillance and automated systems, and it should do so concurrently with other federal agencies focused on sector-specific laws, such as those governing housing or employment.
“Automated decision-making tools are often built and deployed into systems marked by entrenched discrimination, where they can enable or exacerbate serious harm and leave consumers with no opportunity for recourse,” said Marissa Gerchick, ACLU Data Scientist and Algorithmic Justice Specialist. “In adopting new rules, including requirements that companies continuously and independently audit their automated decision-making systems, the commission can provide consumers with much-needed protections from algorithmic error and algorithmic discrimination.”