The momentum for data privacy law in the U.S. has been driven by several key events, from Edward Snowden's controversial disclosure of widespread NSA surveillance to the Cambridge Analytica scandal.
But perhaps more than any other factor, the EU's General Data Protection Regulation (GDPR) has spurred a global privacy movement propelled not only by journalists, activists, and legislators, but also a public that has grown weary of incessant data breaches.
Aside from specific industry regulations (e.g., HIPAA, PCI, FERPA), U.S. privacy issues at the federal level are generally the purview of the Federal Trade Commission (FTC). And while numerous comprehensive state privacy bills have been introduced, only a few have been signed into law.
In this article we'll take a look at some key state privacy laws, consider what they have in common, and explore how those features might inform an all but inevitable federal privacy law.
According to Gartner, by 2022, half of our planet's population will have its personal information covered under local privacy regulations in line with the GDPR, up from a tenth today (report available to clients). If that's true, the following laws are only the beginning for comprehensive U.S. privacy regulation.
California's AB-375, commonly known as the California Consumer Privacy Act (CCPA), was signed on June 28, 2018, and will take effect January 1, 2020. The CCPA establishes several new rights for California residents and regulates the data privacy practices of businesses that fall under specific categories.
Due to California's size and influence, the CCPA will make an impact extending well beyond the state's borders. In fact, Microsoft has already announced it will honor the CCPA across the entire U.S.
The CCPA defines personal information as that which “identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household." Section 1798.140 of the bill goes into granular detail to include elements such as biometric and geolocation data while intentionally leaving the list non-exhaustive to establish a truly comprehensive interpretation of personal information.
CCPA fines reach up to $7,500 per intentional violation, and the law establishes a private right of action that allows consumers to sue in response to violations, which we'll discuss in more detail below.
To learn more, read our recent report: CCPA Requirements You Should Know About
Known as An Act To Protect the Privacy of Online Customer Information, Maine's LD 946 was signed on June 7, 2019, and protects the online privacy of the state's residents. The bill takes effect in July 2020 and focuses on internet service providers (ISPs). Interestingly, it takes an opt-in, rather than opt-out, approach to data privacy: consumer data is not sold by default, and consumers must opt in to having their data sold.
LD 946 defines personal information as identifying information such as name and government identification number. However, the bill goes a step further to include internet usage data such as browsing history, application usage, equipment identifiers, and IP addresses.
The law forbids ISPs from penalizing any customers who deny consent and also prohibits ISPs from offering incentives, such as a discount, in exchange for providing consent. The law is unclear about a private right of action—an issue that will eventually be decided by Maine's courts.
On October 1, 2019, Nevada's opt-out law (SB220) went into effect, regulating the privacy practices of websites and online services that handle the data of Nevada's consumers. The law requires that consumers be given the ability to opt out of the sale of their personal information, with responses required within 60 days.
The Nevada law amends the state's existing data breach law (603A) which defines personal information as the consumer's name in combination with other specified unencrypted data elements:
Social Security number
Driver's license number
Account/credit card number
Medical identification or health insurance number
Information that would allow access to an online account.
If a company is found in violation of the law, Nevada's Attorney General may seek an injunction or issue a penalty of $5,000 per violation. The law clearly states that it does not establish a private right of action.
In May 2018, Vermont's H764 went into effect, becoming the first law in the United States enacted specifically to regulate data brokers. Under the law, a data broker is defined as a business that “knowingly collects and sells or licenses to third parties the brokered personal information of a consumer with whom the business does not have a direct relationship."
Under H764, any business that fits the data broker definition (with a few exceptions) must register with the state, with penalties reaching up to $10,000 per year for non-registration. The law also prohibits fees for credit freezes following a data breach and makes it illegal to acquire personal data “through fraudulent means or with the intent to commit wrongful acts."
Vermont's data broker law protects consumers who live in the state and defines their brokered personal information as (abridged):
Date of birth
Place of birth
Mother's maiden name
Unique biometric data
Name or address of immediate family or household
Government identification numbers
Other information that could reasonably identify the consumer.
While the law establishes security standards and increases transparency of the data broker industry, it does not include any way for consumers to access their data or opt out of its sale. It also does not allow a private right of action.
OK, so this isn't a state law, but New York City has a population of about 8.6 million—more than 38 entire states—so I'm including it. In 2018, New York passed Local Law 49, the nation's first law to regulate the algorithms used in automated decision-making.
The bill established a task force to study the impact of automated decision-making on the allocation of city resources and services. The law was also enacted to determine how algorithmic bias might result in the unfair treatment of residents.
Regulating algorithms is challenging because they are typically designated as trade secrets by the companies that design them. And while some have assailed the law as flawed, it is a rare example of legislators attempting to keep up with rapidly advancing technology.
Several states have established rights that empower consumers to determine what type of personal information companies collect, opt out of its sale, and delete it upon request. These features should form a baseline for a federal privacy law.
However, a truly comprehensive federal privacy law must go beyond basic protections and incorporate the following components already established by state privacy laws.
The definition of personal information varies from state to state but is critical in determining the effectiveness of a data privacy law. If personal information is defined strictly, such as under Nevada's law, many companies will find ways to identify consumers in other ways to continue tracking their behavior and targeting them with ads.
Browser fingerprinting collects unique identifiers such as HTTP headers, device type, screen resolution, operating system, and myriad other attributes to build a profile that uniquely identifies you. This inferential information can tell a data broker everything it needs to know about your online behavior, thwarting your attempt at privacy.
Browser fingerprinting is prohibited by the GDPR which considers it “personal data processing" and under the CCPA, it likely falls under the broad definition of “internet or other electronic network activity information." However, if personal data is defined too explicitly, it reduces the protections afforded to consumers. Nevada's SB220, for example, does not appear to restrict browser fingerprinting due to its precise definition of personal information.
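To see why fingerprinting sidesteps narrow definitions of personal information, consider a minimal sketch of the technique. The attribute names below are illustrative, not drawn from any specific tracking library; the point is that a handful of ordinary, individually innocuous browser attributes can be hashed into a stable identifier with no name, email, or cookie involved.

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Combine browser/device attributes into a single stable identifier.

    Keys are sorted so the same set of attributes always produces the
    same hash, regardless of the order they were collected in.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical attributes a tracker might observe for one visitor.
visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen_resolution": "1920x1080",
    "timezone": "America/New_York",
    "installed_fonts": "Arial,Helvetica,Times",
}

print(browser_fingerprint(visitor))  # a 64-character hex ID for this visitor
```

Because the resulting ID is derived rather than collected, a statute that enumerates only names, account numbers, and the like would not obviously cover it, which is exactly the gap a broad definition of personal information is meant to close.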
Senator Ed Markey's draft of a federal privacy bill describes personal information as that which “directly or indirectly identifies, relates to, describes, is capable of being associated with, or could reasonably be linked to, a particular individual." This language aligns with the CCPA's all-encompassing definition of personal information as that which can be reasonably associated with a consumer.
A thorough federal privacy law must define personal information broadly to account for insidious privacy abuses and remain relevant as technology and marketing strategies evolve.
A pivotal feature of any privacy law is the inclusion of a private right of action, meaning private citizens have the right to take legal action against those in violation of a particular law. Without this feature, the consumer's only recourse is to wait for regulators to issue fines—which often aren't effective—and hope the company changes its privacy-violating ways.
For example, Section 20 of the Illinois Biometric Information Privacy Act allows for a private right of action. That provision has recently led to a $35 billion class action lawsuit against Facebook related to its “tag suggestions" facial recognition feature.
The CCPA's private right of action (section 1798.150) is fairly narrow, allowing for statutory damages of between $100 and $750 per consumer per violation. Furthermore, the terms only relate to data that is non-redacted or unencrypted. To be eligible for damages, the consumer must first notify the offending company of the infringement and allow 30 days for it to “cure" the violation.
If the violation is not “cured", the consumer must then notify and wait on the Attorney General to either prosecute the company in question or respond that the consumer may proceed with the action. The limited individual damages and complex process might make the private right of action highly conducive to class-action lawsuits.
Many of the federal privacy laws circulating in Congress defer to the FTC for enforcement, which isn't exactly reassuring. Furthermore, forced arbitration clauses—commonly buried in online user agreements—often rob consumers of their right to seek damages.
An effective federal privacy law must explicitly allow for a private right of action that empowers consumers to directly seek damages for privacy violations.
Rather than decisions made by humans, the public is increasingly at the mercy of opaque, and often unaccountable, automated decision-making (also referred to as algorithmic decision-making). Algorithms are now used to inform everything from hiring decisions to criminal risk assessments. Left unchecked, these systems can lead to racial, gender, and other biases that result in unfair treatment and negatively impact society.
Attempts are now being made to resolve these issues and improve algorithmic transparency. The GDPR includes a right to explanation, which has been construed as a check on automated decision-making. Recently, Washington became the first state to introduce legislation addressing automated decision-making. And in addition to New York City's aforementioned Local Law 49, the state of New York has proposed legislation to study and make recommendations regarding the impact of AI, robotics, and automation.
On April 10, 2019, the Algorithmic Accountability Act was introduced by Sen. Cory Booker, Sen. Ron Wyden, and Representative Yvette D. Clarke. The federal bill aims to address the “risks posed by the automated decision system to the privacy or security of personal information of consumers and the risks that the automated decision system may result in or contribute to inaccurate, unfair, biased, or discriminatory decisions."
A robust federal privacy law must not overlook the societal impact of companies and governments making decisions based solely on algorithms with no human review.
Another feature shared by several state privacy laws is the protection from adverse consequences as a result of invoking privacy rights. These protections generally relate to the refusal of service or the application of a penalty, such as a price increase.
That means, if you were to opt out of your ISP's data sharing program, that company is not allowed to increase your monthly subscription rate as a result. Similarly, Maine's privacy law includes a provision that prohibits offering incentives for consent. In other words, your ISP can't offer a free month of service in exchange for opting in to data sharing.
A thoughtful federal privacy law must restrict practices that manipulate consumers into trading privacy for access to services.
On October 14, an op-ed threateningly titled “Americans Will Pay a Price for State Privacy Laws" appeared in the New York Times. The piece argues that state privacy laws are too complex and that the only solution is a preemptive federal privacy law.
The op-ed was written by the Internet Association, an industry lobbying group comprising Google, Facebook, Amazon, and dozens of other tech companies that see the CCPA and other emerging state privacy laws as a threat to their business models built on targeted advertising. That's why they fought to weaken the CCPA before embracing a federal policy.
Several tech leaders such as Jeff Bezos and Tim Cook have called for a federal privacy law. In fact, Mark Zuckerberg published an op-ed in the Washington Post earlier this year stating “New privacy regulation in the United States and around the world should build on the protections GDPR provides." It seems Zuckerberg has changed his tune since declaring privacy to no longer be a “social norm" way back in 2010.
On October 17, Sen. Ron Wyden formally introduced the Mind Your Own Business Act, a comprehensive federal privacy bill that includes prison time for executives and steep fines for companies found misusing personal information. Wyden's proposal is just one of numerous federal privacy bills that have been introduced in the last year. Meanwhile, data privacy and antitrust issues involving Big Tech are becoming hot topics in the 2020 election cycle.
A federal privacy law is sure to be enacted in the next few years, but its effectiveness will ultimately depend on the outcome of the 2020 election, how much influence is wielded by industry lobbyists, and whether it builds upon existing state privacy laws.
This document, while intended to inform our clients about state privacy laws, is in no way intended to provide legal advice or to endorse a specific course of action.