On August 20, 2021, China’s national legislature passed the Personal Information Protection Law (“PIPL”), which will become effective on November 1, 2021. As China’s first comprehensive system for protecting personal information, the PIPL is an extension of the personal information and privacy rights enshrined in China’s Civil Code, and also a crucial element of a set of recent laws in China that seek to strengthen data security and privacy. Among other things, the PIPL sets out general rules for the processing and cross-border transfer of personal information. A number of provisions, notably various obligations imposed on data processors, restrictions on cross-border transfer, and hefty fines, will have a significant impact on multinational corporations’ HR activities, including recruitment, performance monitoring, cross-border transfers, compliance investigations, termination of employment relationships, and background checks.

This alert highlights how the PIPL will apply to workplace scenarios in China and offers suggestions to help multinational corporations keep their China labor and employment operations compliant with data privacy requirements.

Employee Consent and Exceptions to Consent

Under Article 4 of the PIPL, “personal information” is defined broadly as information related to natural persons recorded electronically or by other means that has been used or can be used to identify such natural persons, excluding information that has been anonymized. Certain types of personal information are singled out for additional protection under Article 28 of the PIPL as “sensitive personal information”. Sensitive personal information is defined under the law as personal information that, if leaked or used unlawfully, is likely to result in damage to the personal dignity, physical wellbeing or property of any natural person, and includes, among others, information such as biometric identification, religious belief, special identity, medical health, financial accounts, physical location tracking and whereabouts, and the personal information of those under the age of 14.
Continue Reading Employee Personal Information Protection in China – Are You Up to Speed?

On December 15, 2020, the European Commission (EC) presented its long-awaited proposal for a Digital Services Act (DSA), together with a proposal for a Digital Markets Act (DMA), which we discussed in a previous alert. Whereas the DMA aims to promote competition by ensuring fair and contestable markets in the digital sector, the DSA proposal intends to harmonize the liability and accountability rules for digital service providers in order to make the online world a safer and more reliable place for all users in the EU.

Most notably, the DSA would impose far-reaching due diligence obligations on online platforms, with the heaviest burdens falling on “very large” online platforms (i.e., those with more than 45 million average monthly active users in the EU), due to the “systemic” risks such platforms are deemed to pose in terms of their potential to spread illegal content or to harm society. In this day and age when the perceived power of online platforms to independently control content publication and moderation is headline news daily, with governments throughout the globe grappling with different legislative and regulatory proposals, the DSA stands out as an ambitious effort by the EC to create a consistent accountability framework for these platforms, while striking a balance between safeguarding “free speech” and preserving other values and interests in a democratic society. Like the parallel DMA proposal, the DSA proposal has been criticized for targeting mainly U.S.-based companies, which would make up most of the “very large” platforms. Given the huge commercial interests at stake, the passage of both laws will no doubt be the subject of intense debate and lobbying, including with respect to the asymmetric nature of the proposed regulation and the powerful role that the EC reserves to itself in both proposals.
Continue Reading Digital Services Act: The European Commission Proposes An Updated Accountability Framework For Online Services

On December 15, 2020, the European Commission (EC) published its proposal for a Digital Markets Act (DMA). The proposal aims to promote fair and contestable markets in the digital sector. If adopted, it could require substantial changes to the business models of large digital platform service providers by imposing new obligations and prohibiting existing market practices. These changes would create not only significant new obligations for “gatekeeper” platforms, but also opportunities for competitor digital service providers and adjacent firms. Further, the proposed requirements of the DMA have the potential to transform the way that businesses engage with “gatekeeper” providers – including, for example, companies that sell goods and services, distribute apps, and/or purchase advertising on large platforms.

Digital Markets Act Proposal: Main Takeaways

  • Proposes new rules intended to promote fair and contestable markets in the digital sector, which would apply only to providers of “core platform services” designated as “gatekeepers”.
  • Defines “core platform services” to include online search engines, online marketplaces, social networks, messaging and chat apps, video-sharing platforms, operating systems, cloud computing services, and advertising networks and exchanges.
  • Defines “gatekeepers” as providers of core platform services which have a significant impact on the EU internal market, serve as an important gateway for business users to reach customers, and have an entrenched and durable position.
  • Provides quantitative thresholds based on turnover or market value, and user reach, as a basis to identify presumed gatekeepers. Also empowers the Commission to designate companies as gatekeepers following a market investigation.
  • Prohibits gatekeepers from engaging in a number of practices deemed unfair, such as combining personal data across platforms, imposing “wide” most-favored-nation (MFN) clauses, misusing non-public data about the activities of business users and their customers to gain a competitive advantage, blocking users from uninstalling pre-installed applications, and self-preferencing in ranking.
  • Imposes certain affirmative obligations on gatekeepers, including measures to promote interoperability, data access, data portability, and transparency regarding advertising services.
  • Requires gatekeepers to notify below-threshold mergers and to accept independent audits of profiling practices.
  • Puts the Commission in charge of enforcement with extensive investigative powers, including the power to require access to databases and algorithms, and the ability to impose fines of up to 10% of the gatekeeper’s worldwide annual turnover.
  • Empowers the Commission to impose structural remedies, potentially including the divestiture of businesses, for recurring non-compliance.
  • Authorizes the Commission to carry out market investigations to assess whether new gatekeeper practices and services need to be regulated.
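As a rough illustration (not legal advice), the quantitative presumption in the December 2020 proposal could be sketched as a simple predicate. The figures below are those stated in the draft (Article 3(2)) and may change during the legislative process; the function and parameter names are ours, and the separate “entrenched and durable position” criterion is omitted for brevity:

```python
def presumed_gatekeeper(eea_turnover_eur: float,
                        market_cap_eur: float,
                        monthly_end_users_eu: int,
                        yearly_business_users_eu: int) -> bool:
    """Sketch of the DMA proposal's quantitative gatekeeper presumption.

    Thresholds are those in the December 2020 draft; a company meeting
    them is presumed to be a gatekeeper, subject to rebuttal.
    """
    # Size criterion: EEA turnover of at least EUR 6.5 billion, or an
    # average market capitalisation of at least EUR 65 billion.
    significant_impact = (eea_turnover_eur >= 6.5e9
                          or market_cap_eur >= 65e9)
    # Gateway criterion: more than 45 million monthly active end users
    # in the EU and more than 10,000 yearly active business users.
    important_gateway = (monthly_end_users_eu > 45_000_000
                         and yearly_business_users_eu > 10_000)
    return significant_impact and important_gateway
```

Note that under the proposal the presumption is rebuttable, and the Commission may also designate below-threshold companies as gatekeepers following a market investigation.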

Continue Reading Digital Markets Act: The European Commission Unveils Plans to Regulate Digital ‘Gatekeepers’

On December 18, 2020, the Ninth Circuit Court of Appeals held that “Oh, the Places You’ll Boldly Go!,” a Dr. Seuss and Star Trek mashup illustrated book, is not a fair use exempted from copyright liability. Under the Copyright Act of 1976, the factors courts assess in determining if there is fair use include:

  1. The purpose and character of the use, including whether it is of a commercial nature or is for nonprofit educational purposes;
  2. The nature of the copyrighted work;
  3. The amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
  4. The effect of the use upon the potential market for or value of the copyrighted work.

A proposed law issued by the People’s Republic of China (PRC) on October 21, 2020, the draft Personal Information Protection Law, seeks to impose restrictions on entities and individuals, including those operating outside of China, that collect and process personal data and sensitive information on subjects in China. The proposed law also provides for penalties for violations.

This article was originally published in Automotive World.

The future of mobility depends on AI, but without greater understanding among consumers, trust could be hard to build.

The mobility sector is keen to realise the full benefits of artificial intelligence (AI), not least to open up the revenues which data-driven connected services could offer. But moving forward, it must balance these opportunities with the rights of drivers, passengers and pedestrians. A number of concerns have already surfaced, all of which will become more pressing as the technology is further embedded into vehicles, mobility services and infrastructure.

Privacy and liability are two of the major challenges. As Christian Theissen, Partner, White & Case explains, mobility has become inherently connected to consumer habits and behavioural patterns, much like the e-commerce and social media industries. “The access, ownership, storage and transmission of personal data, such as driving patterns, must be taken into consideration by both lawmakers and companies gathering and using data,” he says. Meanwhile, in a world of AI-powered self-driving, at what point do regulators start blaming the machine when something goes wrong?

Part of the challenge in considering these issues is that as things stand, there is limited understanding among consumers around what rights there are. “Consumers appreciate AI,” says Cheri Falvey, Partner, Crowell & Moring, “and in particular the ease with which navigational apps help guide them to their destination. Whether they appreciate how their data is accumulating and developing a record of their mobility patterns, and what their rights are in respect to that data, is another question.”

There is often little precedent for regulators to rely on when making new policy in this arena, so it’s a good time to create a proactive regulatory strategy that invites discussion and collaboration from the start.

This is in part because it is not always clear when AI is at work. A driver may register when a car’s navigation system learns the way home, but won’t necessarily realise that data on how a car is driven is being collected for predictive maintenance purposes, or that their data is being fed into infrastructure networks to manage traffic flow.
Continue Reading Automakers and Regulators Must Educate Consumers on Mobility AI

At 9:30 a.m. Central European Time, privacy professionals around the world were refreshing their browsers to read the long-awaited judgment of the Court of Justice of the European Union (CJEU) principally addressing the viability of Standard Contractual Clauses (SCCs) and the EU-U.S. Privacy Shield (Privacy Shield) as means to transfer personal data from the European Union (EU) to the United States (U.S.).

When the judgment arrived, it landed with a bang: though the CJEU upheld the use of SCCs, it invalidated the Privacy Shield, the well-known mechanism to transfer personal data from the EU to the U.S.  The decision also cast doubt on the viability of other options, including SCCs, for making transatlantic transfers.

The foundation of this decision and previous decisions affirming challenges to U.S. privacy practices is that the protection of personal data is a fundamental right in the EU, akin to a constitutional right in the U.S. The General Data Protection Regulation (GDPR) enshrined these fundamental rights and established uniform data protection standards across the EU designed to protect the personal data of EU-based individuals.
Continue Reading Privacy Shield Invalidated: EU Data Transfers to the U.S. under Siege (again…)
