Privacy & Data Protection

Could the end of Section 6(b) of the Consumer Product Safety Act (CPSA) actually be near?  Time will tell.  But last week’s development on Capitol Hill in the saga of “Section 6(b)” is noteworthy, and, one day in the not-so-distant future, may be recognized as the beginning of the end for this controversial provision of the law.

On April 22, Senator Richard Blumenthal (D-CT) and Representatives Jan Schakowsky (D-IL) and Bobby Rush (D-IL) introduced legislation—the Sunshine in Product Safety Act—to fully repeal Section 6(b) of the CPSA.  This is the first time in recent memory that Members of Congress have introduced legislation to do away with Section 6(b) altogether.  By contrast, in the last Congress, Representative Rush introduced the “SHARE Act,” which sought primarily to scale back one of Section 6(b)’s most important protections for firms—allowing a company to judicially challenge the U.S. Consumer Product Safety Commission’s (“CPSC” or “the Commission”) decision to release information about a firm, or one of its products, prior to its disclosure.  But that legislation left the rest of Section 6(b)’s procedures and protections intact.  This current bill, therefore, is much more ambitious, and stakeholders should take note.

By way of background, Section 6(b) requires the CPSC to engage in certain procedural steps before publicly disclosing information from which the identity of a manufacturer of a product can be readily ascertained.  Those include taking reasonable steps to ensure that the information to be disclosed publicly is fair, accurate, and reasonably related to effectuating the purpose of the product safety laws.  Practically speaking, this means notifying the manufacturer of the potential disclosure, providing either a summary of what the agency intends to disclose or the actual disclosure itself, and giving the company an opportunity to comment, typically within 15 days, though the CPSC can shorten that period with a “public health and safety finding.”  Other regulators, like FDA and NHTSA, are not subject to similar statutory constraints on the release of product information, nor must they provide comparable due process protections before releasing data, whether that data concerns adverse events or vehicle accidents.
Continue Reading New Bills Seek to Repeal Controversial Provision of Product Safety Act

Last week the Supreme Court unanimously held that §13(b) of the Federal Trade Commission Act does not give the Federal Trade Commission the power to seek equitable monetary relief such as disgorgement or restitution. The Court’s opinion in AMG Capital Management LLC v. Federal Trade Commission removes a powerful tool that the FTC has long relied on to pursue monetary relief for consumers in both consumer protection and competition matters.

By way of background, the FTC has authority to protect consumers from unfair or deceptive acts or practices (“UDAP”) and unfair methods of competition (“UMC”), with an overlapping but distinct set of tools it can use to pursue its dual consumer protection and competition missions:

  • Administrative Proceeding: The FTC can initiate an administrative proceeding before an administrative law judge to seek a cease and desist order for either a UDAP or UMC violation. If necessary, the FTC can later bring a contempt proceeding in federal court to enforce the terms of an administrative order. A defendant may respond by arguing that it has “substantially complied” with the terms of the order. If the FTC prevails in such a case, it can seek civil penalties and other equitable relief necessary to enforce the order (though monetary relief is available only for UDAP violations).
  • Rulemaking: The FTC has authority to promulgate rules that define UDAP with specificity. Generally, this requires a lengthy, formal rulemaking process that allows for public comment, and a final rule can be challenged in federal court. If a defendant later violates a duly enacted UDAP rule, the FTC can seek civil penalties for a knowing violation. The FTC can also file suit in federal court and obtain monetary relief “to redress consumer injury,” including an order compelling “refund of money or return of property,” but only if “a reasonable man would have known under the circumstances [that the challenged conduct] was dishonest or fraudulent.”
  • Federal Court: The FTC can sue in federal court under §13(b) of the FTC Act to enjoin a defendant when the defendant “is violating, or is about to violate” a law that the FTC enforces and such an injunction is in the public’s interest. While courts have historically read §13(b) as giving the FTC an implied right to recover equitable monetary relief in addition to injunctive relief, the Supreme Court’s ruling now limits the FTC to seeking injunctive relief only.


Continue Reading The Supreme Court Limits FTC’s §13(b) Powers

The Virginia Consumer Data Protection Act (CDPA) has become the next major U.S. state privacy law, after being signed into law by Virginia Governor Ralph Northam on Tuesday, March 2, 2021. The new law amends Title 59.1 of the Code of Virginia with a new chapter 52 (creating Code of Virginia sections 59.1-571 through 59.1-581).

Who is covered?

Per Section 59.1-572, the bill applies to “persons that conduct business in the Commonwealth or that produce products or services that are targeted to residents of the Commonwealth” who “control or process personal data of at least 100,000 consumers” or those who “control or process the data of at least 25,000 consumers” AND “derive at least 50% of their gross revenue from the sale of personal data.”

As defined in Section 59.1-571 of the bill, a “[c]onsumer” is any “natural person who is a resident of the Commonwealth acting only in an individual or household context. [Consumer] does not include a natural person acting in a commercial or employment context.”

Both covered entities and “consumers” are defined more narrowly than under other general data privacy laws such as the California Consumer Privacy Act (CCPA). For example, in contrast to the CCPA’s application to any California business with more than $25 million in annual revenue, the CDPA does NOT apply on a blanket basis to any Virginia business above a specified revenue threshold. To be covered under the CDPA, a person must always process the data of a minimum number of Virginia residents “acting only in an individual or household context.” Additionally, the exemption for individuals acting in “commercial” or “employment” contexts is a complete one and, unlike the California law, does not have a “sunset” date on which the exemption will expire.

Notably, the CDPA follows the model established under the EU General Data Protection Regulation and categorizes relevant businesses as “controllers” and “processors.” “Controllers” are “the natural or legal person that, alone or jointly with others, determines the purpose and means of processing personal data,” while “processors” are “a natural or legal entity that processes personal data on behalf of a controller.” Similar to the controller/processor relationship created by the GDPR and the business/service provider relationship created under the CCPA, a CDPA processor must be engaged by a controller via a written agreement that governs the processor’s data processing and provides specific instructions for the processing of data, as well as the nature and purpose of the processing.
Continue Reading Virginia Consumer Data Protection Act (S.B. 1392)

On December 15, 2020, the European Commission (EC) presented its long-awaited proposal for a Digital Services Act (DSA), together with a proposal for a Digital Markets Act (DMA), which we discussed in a previous alert. Whereas the DMA aims to promote competition by ensuring fair and contestable markets in the digital sector, the DSA proposal intends to harmonize the liability and accountability rules for digital service providers in order to make the online world a safer and more reliable place for all users in the EU.

Most notably, the DSA would impose far-reaching due diligence obligations on online platforms, with the heaviest burdens falling on “very large” online platforms (i.e., those with more than 45 million average monthly active users in the EU), due to the “systemic” risks such platforms are deemed to pose in terms of their potential to spread illegal content or to harm society. In this day and age when the perceived power of online platforms to independently control content publication and moderation is headline news daily, with governments throughout the globe grappling with different legislative and regulatory proposals, the DSA stands out as an ambitious effort by the EC to create a consistent accountability framework for these platforms, while striking a balance between safeguarding “free speech” and preserving other values and interests in a democratic society. Like the parallel DMA proposal, the DSA proposal has been criticized for targeting mainly U.S.-based companies, which would make up most of the “very large” platforms. Given the huge commercial interests at stake, the passage of both laws will no doubt be the subject of intense debate and lobbying, including with respect to the asymmetric nature of the proposed regulation and the powerful role that the EC reserves to itself in both proposals.
Continue Reading Digital Services Act: The European Commission Proposes An Updated Accountability Framework For Online Services

On December 18, 2020, the Ninth Circuit Court of Appeals held that “Oh, the Places You’ll Boldly Go!,” a Dr. Seuss and Star Trek mashup illustrated book, is not a fair use exempted from copyright liability. Under the Copyright Act of 1976, the factors courts assess in determining if there is fair use include:

  1. The purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
  2. The nature of the copyrighted work;
  3. The amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
  4. The effect of the use upon the potential market for or value of the copyrighted work.

On November 30, 2020, New York Governor Cuomo signed into law a bill that allows the estates and representatives of deceased individuals to defend their names and likenesses from commercial exploitation, giving estates continued control over a person’s likeness after death. The new law, which establishes a “Right to Publicity” for deceased individuals who were domiciled in New York at their time of death, allows estates to protect those attributes of a deceased person that have commercial value, including their name, picture, voice, or signature, against unauthorized use.

In connection with the new post-mortem right to publicity, Governor Cuomo stated, “In the digital age, deceased individuals can often fall victim to bad actors that seek to capitalize on their death and profit off of their likeness after they pass away – that ends today. This legislation is an important step in protecting the rights of deceased individuals while creating a safer, fairer New York for decades to come.” The new post-mortem right of publicity applies up to 40 years after the death of the deceased personality, and it provides certain exceptions, such as for works of art or political interest, parodies and satires, and the use of names and likenesses in the news.

In enacting this law, New York joins the minority of U.S. states which recognize a post-mortem right of publicity, an area of law that has long been controversial and which has resulted in extensive discussion of choice-of-law rules.
Continue Reading ‘Imagine’ This: John Lennon Would Have Received Post-Mortem Right to Publicity in New York

Last week, the President signed the Internet of Things (IoT) Cybersecurity Improvement Act into law, kicking off a multi-year process that will culminate in the first-ever federal requirements for IoT devices. Under the law, the National Institute of Standards & Technology (NIST) is now charged with drafting and finalizing security requirements for IoT devices.

On November 3, 2020, California voters approved California Proposition 24, also known as the California Privacy Rights Act of 2020, or CPRA. The CPRA expands protections afforded to personal information, building off of the California Consumer Privacy Act (CCPA), which took effect in January of this year. While some of the CPRA changes will take effect immediately, most will not become enforceable until July 1, 2023, and apply only to personal information collected after January 1, 2022.

Key Changes to CA Privacy Law

At 54 pages long, the CPRA makes numerous changes to the CCPA, ranging from minor revisions to the introduction of new concepts and the creation of several new consumer rights. Some of the most impactful changes are discussed below. A series of future client alerts will explore the nuances of these changes in greater detail.

Sensitive Personal Data

The CPRA establishes new rules for a category of “sensitive personal information,” which includes, for example, genetic data and religious or philosophical beliefs. Sensitive personal information is defined as:

(1) personal information that reveals:

  1. a consumer’s social security, driver’s license, state identification card, or passport number;
  2. a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account;
  3. a consumer’s precise geolocation;
  4. a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership;
  5. the contents of a consumer’s mail, email and text messages, unless the business is the intended recipient of the communication; and
  6. a consumer’s genetic data; and

(2)

  1. the processing of biometric information for the purpose of uniquely identifying a consumer;
  2. personal information collected and analyzed concerning a consumer’s health; or
  3. personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

This definition is among the most impactful changes in the CPRA, given the breadth of data that it sweeps in, along with the creation of new disclosure and opt-out rights associated with “sensitive personal information.” These changes will likely require covered businesses to dive into their data, map it, and ensure they are compliant.

In addition, the CPRA creates a right for consumers to “limit use and disclosure of sensitive personal information.” Similar to existing CCPA opt-out rights, beginning in 2023, consumers may direct businesses that collect sensitive personal information to limit its use to that “which is necessary to perform the services or provide the goods reasonably expected by an average consumer” or to perform a small subset of specifically identified exempt services. Significantly, exemptions to the opt-out will include short-term, transient advertising, and “performing services on behalf of the business,” but not general advertising and marketing, nor long-term profiling or behavioral marketing technologies.
Continue Reading CCPA 2.0? California Adopts Sweeping New Data Privacy Protections

A proposed law issued by the People’s Republic of China (PRC) on October 21, 2020, the draft Personal Information Protection Law, seeks to impose restrictions on entities and individuals, including those operating outside of China, that collect and process personal data and sensitive information on subjects in China. The proposed law also provides for penalties for non-compliance.

This article was originally published in Automotive World.

The future of mobility is dependent on AI, but without greater understanding among consumers, trust could be hard to build.

The mobility sector is keen to realise the full benefits of artificial intelligence (AI), not least to open up the revenues which data-driven connected services could offer. But moving forward, it must balance these opportunities with the rights of drivers, passengers and pedestrians. A number of concerns have already surfaced, all of which will become more pressing as the technology is further embedded into vehicles, mobility services and infrastructure.

Privacy and liability are two of the major challenges. As Christian Theissen, Partner, White & Case explains, mobility has become inherently connected to consumer habits and behavioural patterns, much like the e-commerce and social media industries. “The access, ownership, storage and transmission of personal data, such as driving patterns, must be taken into consideration by both lawmakers and companies gathering and using data,” he says. Meanwhile, in a world of AI-powered self-driving, at what point do regulators start blaming the machine when something goes wrong?

Part of the challenge in considering these issues is that as things stand, there is limited understanding among consumers around what rights there are. “Consumers appreciate AI,” says Cheri Falvey, Partner, Crowell & Moring, “and in particular the ease with which navigational apps help guide them to their destination. Whether they appreciate how their data is accumulating and developing a record of their mobility patterns, and what their rights are in respect to that data, is another question.”

There is often little precedent for regulators to rely on when making new policy in this arena, so it’s a good time to create a proactive regulatory strategy that invites discussion and collaboration from the start.

This is in part because it is not always clear when AI is at work. A driver may register when a car’s navigation system learns the way home, but won’t necessarily realise that data on how a car is driven is being collected for predictive maintenance purposes, or that their data is being fed into infrastructure networks to manage traffic flow.


Continue Reading Automakers and Regulators Must Educate Consumers on Mobility AI