On March 15, 2021, California brought “dark pattern” prohibitions squarely into the mainstream of American privacy and software design law. The new rule will affect some U.S. companies directly, and a far larger number indirectly.
“Dark patterns” refer to software interface design features that, according to many behavioral scientists, can effectively deceive users about the consequences of choices they make in interacting with the products. Designers have used the “dark patterns” term for years, but the phrase gained particular attention after its use in a 2018 report by a Norwegian consumer advocacy group. That report charged several popular companies with using misleading interface features to secure users’ opt-in to privacy practices they did not truly understand. Since then, a variety of U.S. regulators have shown interest in this issue. A bipartisan group of U.S. Senators has introduced legislation to regulate misleading “dark pattern” practices, and the Federal Trade Commission, America’s primary enforcer of privacy laws, recently announced workshops to discuss possible upcoming regulations. California has now moved forward to address “dark patterns” directly in its state regulations.
On March 15, 2021, California’s Attorney General, Xavier Becerra, announced that the final set of regulations implementing the California Consumer Privacy Act (CCPA) had taken effect. While the CCPA itself took effect in 2020, it authorized the Attorney General to issue further regulations supplementing its provisions. (The CCPA is distinct from the California Privacy Rights Act (CPRA), which takes effect in 2023, and from the California Online Privacy Protection Act (CalOPPA), which has been the state’s main privacy law since 2004.)
The rule announced on March 15 fleshes out the consumer opt-out provision that appeared in the CCPA itself. That provision requires U.S. companies subject to the law to honor requests by consumers to opt out of the companies’ sale of the consumers’ information to third parties. The March 15 regulation clarifies what this means. In the rule’s language: “A business’s methods for submitting requests to opt-out shall be easy for consumers to execute and shall require minimal steps to allow the consumer to opt-out. A business shall not use a method that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out.” The rule gives one example of a forbidden design: a double-negative dialog box reading “Don’t Not Sell My Personal Information.” It is not hard to imagine other choices that might violate the law, such as an opt-out that requires multiple clicks across different web pages or requires consumers to reply to a follow-up email. Either approach could leave consumers unaware that a second action was needed to execute their intended choice. Importantly, a company can violate this rule even without deliberately trying to thwart consumers; the rule looks to the actual effect of a design choice as well as its intent.
The March 15 rule applies only to companies that are subject to the CCPA. The test for determining applicability is complex. The relevant test for most companies is whether (1) their annual gross revenue exceeds $25 million, (2) they collect personal information relating to California residents (which most large companies with websites do) and (3) they do business in California. (“Doing business in California” is itself a nuanced test, but suffice it to say that many large companies with no operations in California satisfy it by having significant sales in the state or owning assets there.) If all three are true, the company likely must comply with the CCPA.
In addition to those companies directly covered by the CCPA, a far broader category of U.S. businesses may soon need to comply with its major principles. This is because of flow-down obligations increasingly imposed on smaller businesses by larger companies that are subject to the CCPA. As companies become governed by the CCPA, they often must require their outside vendors and suppliers to comply with certain CCPA provisions as well. And this comes at a time when larger U.S. companies were already imposing similar obligations on vendors and suppliers to comply with Europe’s General Data Protection Regulation (GDPR), which contains many provisions similar to the CCPA. For good measure, the Virginia Consumer Data Protection Act that became law earlier this month also has features similar to the CCPA and GDPR.
The net effect of all of these laws is that more and more U.S. businesses will adopt practices similar to those required by the CCPA’s dark pattern rule – not because the law requires them to, but because their business partners do. This is true even for companies that do not think of themselves as software companies. Your local brewery’s card-swiping terminal shares your customer data with a host of payment companies. Your grocery store’s new mobile app does the same. Your nearest hotel likely uses booking software that passes your data back and forth to aggregating websites. The CCPA’s dark pattern rule is just one state’s law today, but it will soon be widely followed.
The March 15 CCPA Final Regulation is available here.
For additional guidance, see:
- Linked Privacy Claim Survives a Motion to Dismiss
- FTC Investigations: What to Expect When the FTC Comes Calling
- A Legal Checklist for Early-Stage Tech Companies
Perkins Thompson routinely counsels American businesses on privacy and other technology matters. Please reach out to Adam Nyhan in our Intellectual Property & Technology Group if you have any questions.