Countdown to Data Privacy Day 2024

January 22 - January 28, 2024

India’s Data Privacy Law: The Digital Personal Data Protection Act

January 30, 2024

By: Shannon A. Knapp and Victoria M. Okraszewski

Late last year, India passed the Digital Personal Data Protection Act (DPDPA), joining the growing number of countries to enact a comprehensive consumer privacy law. After years of amendments and debate, the DPDPA will replace India’s current piecemeal data protection regime. The purpose of the DPDPA is to establish transparency and accountability in the collection and processing of the personal data of India residents. The law is not yet in effect and will not be until the government finalizes and passes the detailed rules required for implementation; however, the DPDPA is expected to go into effect sometime this year.

Who does the DPDPA apply to?

The DPDPA applies to organizations that process the personal data of India residents, whether those organizations are located within or outside of India. The extra-territorial reach of the DPDPA is triggered when an organization located outside of India processes personal data in connection with any activity related to offering goods or services to residents of India.

What does the DPDPA do?

As with most consumer privacy laws, the DPDPA grants data principals (defined as the individuals to whom the personal data relates) the right to know, correct and erase their data. Personal data is broadly defined under the law as any data about an individual who is identifiable by or in relation to such data. Such a broad definition means that many data points, even an email address, are considered personal data under the act. Unlike the European Union’s General Data Protection Regulation (GDPR), the DPDPA draws no distinction for sensitive personal data, meaning that all personal data is expected to be collected and processed in the same manner and with the same protections.

Of particular note, the DPDPA does not restrict the transfer of personal data outside of India. However, it does enable the government of India to restrict, by notification, the transfer of personal data to certain countries or territories outside India. It is unclear under the current version of the act what such transfer restrictions could look like, but the implementing regulations, along with other government guidance, may provide further clarity.

Like the GDPR and China’s Personal Information Protection Law, the DPDPA requires covered organizations to have a legal basis for processing personal data. Covered organizations may process personal data for a lawful purpose to which the individual has consented, or for certain “legitimate uses,” such as responding to a medical emergency or fulfilling the purpose for which the individual voluntarily provided the information and has not indicated that they do not consent to the use of their personal data.

Moreover, the DPDPA places numerous additional obligations on covered organizations, including, but not limited to, providing notice to individuals concerning the entity’s collection and processing practices, designating data protection officers, preparing for and responding to data breaches and performing data protection impact assessments.

Notably, unlike the GDPR and U.S. consumer privacy laws, the DPDPA also places obligations on the data principal. These duties include complying with all applicable laws when exercising their rights, not impersonating another individual when providing personal data, not suppressing material information when providing personal data, not filing a false or frivolous grievance or complaint, and furnishing verifiably authentic information when exercising the right to correct or erase their personal data.

What are the enforcement mechanisms?

The soon-to-be-created Data Protection Board will have enforcement power over noncompliance and data breaches, and the act imposes steep fines. A covered organization that experiences a data breach and fails to fulfill its breach reporting obligations could face a fine of up to Rs. 250 crore (approximately $30 million in U.S. dollars).

What does this mean for your organization?

Organizations that do business in India or that recruit or employ residents of India should begin to assess their policies and procedures. Although the DPDPA shares many similarities with the GDPR and U.S. consumer privacy laws, there are notable differences; a gap analysis should therefore be conducted to ensure compliance with all applicable consumer privacy laws. For example, businesses should implement a policy that addresses the consent requirements for processing the personal data of India residents.

Bond attorneys regularly assist and advise clients on an array of data privacy and cybersecurity matters. For more information regarding India’s Digital Personal Data Protection Act and to discuss compliance efforts businesses should be taking, contact Shannon Knapp, CIPP/US & CIPP/A, Victoria Okraszewski or any attorney in the cybersecurity and data privacy practice.

 

FTC Seeks Expansion of Children’s Privacy Protection Law

January 26, 2024

By: Jessica L. Copeland and Victoria Okraszewski

On Dec. 20, 2023, the Federal Trade Commission (FTC) published a Notice of Proposed Rulemaking (NPRM) proposing amendments to the rule implementing the Children’s Online Privacy Protection Act (COPPA). COPPA was enacted in 1998 and went into effect in 2000. Under COPPA, certain online entities must obtain parental consent and provide notice before collecting, using or disclosing personal information from children under the age of 13. The COPPA Rule was last updated in 2013. Over a decade later, the FTC seeks to expand its protections.

The proposed amendments would make significant changes to the COPPA Rule, including additional restrictions on the use and disclosure of children’s personal information and further constraints on companies that monetize children’s data. The goal of the proposed amendments is to limit the collection and exploitation of children’s personal information and to provide a secure digital environment for children to explore safely.

Proposed Amendments

  1. Additional Opt-In Required for Targeted Ads
    Entities subject to COPPA must obtain separate verifiable parental consent to disclose information to third parties (including third-party advertisers) unless the disclosure is integral to the nature of the website or online service.
  2. Collection of Personal Data Cannot Be a Condition of Participation
    Covered entities are prohibited from conditioning a child’s participation in a game, the offering of a prize or another activity on the child providing more personal information than is reasonably necessary. The FTC is considering adding language to clarify the meaning of “activities.”
  3. Limits on the “Support for the Internal Operations” Exception
    Currently, covered entities can collect persistent identifiers without first obtaining parental consent if: (1) the entity does not collect any other personal information; and (2) the entity uses the persistent identifier solely to provide “support for the internal operations of the website or online service.” Under COPPA, “support for internal operations” includes activities necessary to maintain or analyze the functioning of a site or online service; authenticate users of, or personalize content on, the site or online service; serve contextual advertising or cap the frequency of advertising; protect the security or integrity of the user, site or online service; ensure legal or regulatory compliance; or fulfill a request of a child. Additionally, under the proposed amendments, exempt entities must provide an online notice that states the specific internal operations for which a persistent identifier has been collected and how the entity will ensure that the identifier is not used or disclosed to contact a specific individual.
  4. Limits on Coercing Kids to Stay Online
    Covered entities are prohibited from using personal information collected under COPPA’s “multiple contact” and “support for the internal operations” exceptions to send push notifications to children to encourage them to use the service more. Entities that use personal information collected from a child to prompt or encourage use of their service would also be required to flag that usage in their COPPA-required direct and online notices.
  5. Education-Based Technology
    The amended rule would allow schools and school districts to authorize education-based technology providers to collect, use and disclose students’ personal information but only for a school-authorized educational purpose and not for any commercial purpose.
  6. Increasing Accountability for Safe Harbor Programs
    To increase transparency and accountability, COPPA Safe Harbor programs would be required to publicly disclose their membership lists and to report additional information, such as the program’s business model, to the FTC.
  7. Strengthening Data Security Requirements
    Covered entities must establish, implement and maintain a written children’s personal information security program that includes safeguards appropriate to the sensitivity of children’s data collected.
  8. Limits on Data Retention
    Under the amended rule, children’s personal information could be retained only for as long as necessary to fulfill the purpose for which it was collected. In addition, covered entities would be prohibited from retaining the information for any secondary purpose and would be required to establish and publicize a written data retention policy for children’s personal information.

Takeaways

The NPRM was published in the Federal Register on Jan. 11, 2024. Parties seeking to comment on the proposed changes will have until March 11, 2024, to submit comments. In addition, on Jan. 18, 2024, the FTC held an open meeting of the Commission and presented the proposed changes to COPPA.

The FTC’s proposed amendments to COPPA signal increased attention to expanding privacy protections for individuals under the age of 18. Shortly after the NPRM announcement, U.S. Senators Bill Cassidy (R-LA) and Edward Markey (D-MA) voiced their support for the proposed amendments. Specifically, they noted that Congress should enact the Children and Teens’ Online Privacy Protection Act, known as COPPA 2.0, which would strengthen protections for minors in relation to the online collection, use and disclosure of their personal information.

Moreover, several states have already begun expanding privacy protections for individuals under the age of 18. For example, the California Consumer Privacy Act requires covered entities to obtain consent before selling the personal information of consumers under the age of 16. Additionally, Utah, Arkansas, Louisiana and Texas have enacted laws that prohibit social media sites from allowing minors to use their services without parental consent. Most recently, New Jersey passed the New Jersey Data Privacy Act, which prohibits the processing of the personal data of consumers between the ages of 13 and 17 for certain purposes without consent.

Bond attorneys regularly assist and advise clients on an array of data privacy and cybersecurity matters, including compliance with COPPA and other privacy authorities. If you have any questions about COPPA or FTC privacy enforcement, please contact Jessica Copeland, CIPP/US, Victoria Okraszewski or any attorney in Bond's cybersecurity and data privacy practice.
 

New Jersey Becomes 13th State to Enact Consumer Privacy Law

January 25, 2024

By: Amber L. Lawyer, Shannon A. Knapp, and Victoria Okraszewski

On Jan. 16, 2024, the New Jersey Governor signed the New Jersey Data Privacy Act (the Act) into law, making New Jersey the 13th state to adopt a comprehensive consumer privacy law. While the Act follows the lead of many other comprehensive privacy laws, it is more expansive in several regards, particularly its threshold for applicability, its broader definitions, its required opt-out mechanisms and its expansive child privacy protections. The law will take effect in January 2025. Some of the most important aspects of the law are detailed below.

Organizations Covered

The Act will apply to entities and individuals that conduct business in New Jersey or produce products or services that are targeted to New Jersey consumers, and that during the preceding calendar year either:

  1. Control or process the personal data of at least 100,000 consumers, excluding personal data processed solely for the purpose of completing a payment transaction; or
  2. Control or process the personal data of at least 25,000 consumers and the controller derives revenue or receives a discount on the price of any goods or services, from the sale of personal data.

Notably, similar to Colorado and Texas, the law does not include a revenue threshold.

Definitions

“Consumer” is broadly defined by the Act as a New Jersey resident acting in an individual, job-seeking or household context. Unlike the CCPA and the GDPR, this definition does not apply to individuals acting in the employment context.

Adding more nuance, the Act expands the definition of “sensitive data” beyond that of many other consumer privacy laws to include financial information. This includes a consumer’s account number, account log-in, financial account, or credit or debit card number, in combination with any required security code, access code or password that would permit access to the consumer’s financial account. This inclusion is significant, as the Act requires covered entities to obtain consent prior to collecting and processing sensitive data.

Consumer Rights

The Act grants New Jersey consumers broad rights concerning their personal data, which are similar to the consumer rights found under many of the other state consumer privacy laws. These consumer privacy rights include the right to know and access personal data, correct inaccuracies in the consumer’s personal data, delete personal data, obtain a copy of the data and opt out of the processing of personal data for (a) targeted advertising, (b) the sale of personal data or (c) profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.

Other Requirements for Covered Entities

Covered entities are required to publish a privacy notice containing certain information regarding the entity’s collection, use, disclosure and retention practices. Further, following in the footsteps of Colorado, Connecticut, Montana, Oregon, Delaware and Texas, the Act requires covered entities to recognize Universal Opt-Out Mechanisms (UOOMs) within six months of the Act’s effective date. A UOOM is a mechanism by which consumers can exercise their right to opt out of a platform’s processing of their personal data for certain purposes by sending a signal indicating their opt-out preferences. Under the Act, UOOMs apply to targeted advertising and the sale of personal data.
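
By way of illustration, one widely deployed universal opt-out signal is the Global Privacy Control (GPC), which participating browsers transmit as the "Sec-GPC" HTTP request header (and expose to page scripts as navigator.globalPrivacyControl). The Act does not prescribe any particular technical specification, so the sketch below, which assumes a Node.js/Express server, is only illustrative of how a site might detect such a signal and flag a request for opt-out treatment:

```typescript
// Minimal sketch (not a compliance implementation): detecting the Global
// Privacy Control signal, one common universal opt-out mechanism.
// Assumes an Express app; the port and route are illustrative only.
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Participating browsers send "Sec-GPC: 1" on every request.
function gpcOptOut(req: Request, res: Response, next: NextFunction) {
  // Record the preference so downstream handlers can suppress targeted
  // advertising and the sale of personal data tied to this request.
  res.locals.gpcOptOut = req.header("Sec-GPC") === "1";
  next();
}

app.use(gpcOptOut);

app.get("/", (_req: Request, res: Response) => {
  res.send(
    res.locals.gpcOptOut
      ? "Opt-out signal received; targeted advertising suppressed."
      : "No universal opt-out signal received."
  );
});

app.listen(3000);
```

Downstream handlers would then consult that flag before serving targeted advertising or selling personal data associated with the request; how the opt-out is actually honored will depend on each business’s data flows and should be confirmed against the Act’s implementing regulations once issued.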

In addition, the Act requires covered entities to conduct data protection assessments before engaging in processing that presents a heightened risk of harm. Activities that may present such heightened risk include certain targeted advertising practices, the sale of personal data and the processing of sensitive information.

Children’s Data

The Act specifically prohibits processing the personal data of consumers between the ages of 13 and 17 for certain activities, such as targeted advertising, without proper consent. This provision applies if the covered entity has actual knowledge of, or willfully disregards, the fact that the consumer is in this age group. The provision is similar to the children’s data provisions in the Delaware and Oregon privacy laws; however, New Jersey goes a step further by including an affirmative opt-in consent requirement and by protecting a larger age range. Such expansive requirements around children’s data demonstrate legislatures’ increased focus on children’s privacy.

Exemptions

The exemptions included in the Act are much narrower than those in other privacy laws. Notably, the Act applies to nonprofit organizations that meet the threshold requirements detailed above. Additionally, the Act provides no exemption for institutions of higher education or for information subject to the Family Educational Rights and Privacy Act (FERPA).

The Act exempts certain types of data and entities, such as personal health information as defined under the Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health Act (HITECH), Gramm-Leach Bliley Act (GLBA) financial institutions and some state agencies.

Enforcement

The Act does not contain a private right of action for consumers and will be enforced exclusively by the New Jersey Attorney General. Covered entities will have a 30-day cure period to remedy violations during the first 18 months the law is in effect. Similar to California, the Act also requires the Division of Consumer Affairs to issue regulations to effectuate the Act’s intent.

While there is much overlap between the Act and other consumer privacy laws, there are also notable differences that will affect covered entities’ compliance obligations. Therefore, during the period between enactment and the effective date, businesses subject to the Act should assess their current data collection and processing activities, as well as any internal and public-facing policies.

Bond attorneys regularly assist and advise clients on an array of data privacy and cybersecurity matters. Please contact Amber Lawyer, CIPP/US & CIPP/E, Shannon Knapp, CIPP/US & CIPP/A, Victoria Okraszewski or any attorney in Bond's cybersecurity and data privacy practice if you have questions regarding the implementation of the New Jersey Data Privacy Act and its impact on your business.
 

The Impact of Merck’s NotPetya Policy Claims and a Reported Settlement: The Cyber Insurance Pendulum Swings, Again

January 24, 2024

By: Gabriel S. Oberfield and Victoria Okraszewski

By definition, insurance policies represent an exercise in planning for (and hedging against) catastrophe. Cyber insurance for the healthcare industry is no exception. But a hedge is only as good as its reliability, and many healthcare providers have been left high and dry when seeking to collect on their policies – finding their claims ‘carved out’ by exclusionary language. That pendulum may be swinging in the other direction, however, if a recent reported settlement is any indication.

Merck Settlement

Merck & Co. (Merck), a pharmaceutical company, was a 2017 target of the global NotPetya attack and alleged $1.4 billion in damages. Its cyber insurers sought to avoid payouts by relying on policy exclusions relating to war. A New Jersey appellate court upheld a lower court ruling that such exclusionary language did not apply to cyberwarfare, which it distinguished from physical warfare. Shortly before the New Jersey Supreme Court was set to hear oral arguments, Merck and several insurers reached a confidential settlement concerning the alleged damages (reported here). This may indicate a sea change in policy construction – one that providers inking policies should approach with eyes open, and that insurers likewise will approach with due care.

The Evolution of the Cyber Insurance Marketplace

Prior to 2017, cyber insurance was still emerging, and policies were comparatively rare. Indeed, many insurers were competing for business, driving down the cost of cyber insurance. Within the last five years, however, ransomware attacks have increased, causing an uptick in the need for cyber insurance. Simultaneously, losses escalated, and insurers began to implement more stringent standards while charging higher premiums. This caused various entities to institute better ‘cyber-hygiene’ through tactics such as multifactor authentication (MFA) and endpoint detection and response (EDR). (For more information on this dynamic, please see this 2023 presentation delivered by Gabriel Oberfield, one of this piece’s co-authors.)

Indeed, insurers are requiring companies to implement and maintain specific security controls that track evolving standards, such as the National Institute of Standards and Technology (NIST) Cybersecurity Framework. Moreover, insurers now require details regarding a company’s information security practices, including whether the company has MFA and EDR in place, and some insurers have begun mandating ongoing cybersecurity awareness training on the premise that informed employees mitigate risk and lessen any downtime an attack may cause.

Disputes concerning implementation of safeguards are far from unusual. For instance, at least one notable university sued Lloyd’s of London for the expenses related to a breach that exposed the personal information of patients at the university's health facilities. According to Lloyd’s of London, the university, which it insured, failed to comply with data security provisions under its policies.

The Pendulum Swings, Again

Whether driven by the Merck settlement or otherwise reading the writing on the wall, some insurers have begun to exclude coverage for the effects of cyberwarfare, including but not limited to state-sponsored attacks. According to reports, Lloyd’s of London has instituted numerous such exclusions.

Where Attorneys Can Help

As the pendulum sways, it is important for policyholders to carefully consider and negotiate the reach of exclusionary language during cybersecurity policy binding and renewal periods.

Bond attorneys regularly assist and advise clients on an array of data privacy and cybersecurity matters, including in the cyber insurance space. If you have any questions about this memo, please contact Gabriel Oberfield, Victoria Okraszewski or any attorney in Bond's cybersecurity and data privacy practice.

ChatGPT – Hallucinating Case Law, Instigating Attorney Sanctions and Stealing Privilege – Oh My!

January 23, 2024

By: Jessica L. Copeland and Jackson K. Somes

In 2023, the use of generative artificial intelligence (Gen AI) tools such as ChatGPT went viral. Gen AI platforms can create a host of efficiencies across businesses and professions, such as drafting emails, summarizing meetings and writing or debugging lines of code. However, the widespread adoption of ChatGPT and other Gen AI resources quickly revealed the risks inherent in these platforms. Chief among those risks for attorneys are the ethical and privacy concerns that arise from how these platforms are used.

Ethical Concerns

The meteoric rise of publicly available Gen AI tools (e.g., Google’s Bard, ChatGPT and Dall-E) has exposed the risks users face when placing their trust in these tools. Concerningly, a rapidly increasing number of lawyers have turned to ChatGPT and the like to perform their legal research and writing tasks. By now, most readers of this post have heard of the two New York attorneys sanctioned for filing a legal brief created by ChatGPT that cited “hallucinated” court cases to support their argument. Many may also be tracking the latest developments concerning Michael Cohen, Donald Trump’s former lawyer, who similarly may face sanctions for citing “hallucinated” cases he found using Google’s Bard. These episodes are prime illustrations of why a user, especially an officer of the court, should not blindly rely on material generated by Gen AI tools.

This spate of sanctionable attorney activity has prompted several state bars, courts and individual judges to issue AI guidelines. Particular to an attorney’s ethical obligations, California’s guidelines provide that attorneys should consider disclosing to their clients when Gen AI is used to create work product.[1]

Privacy Concerns

Users of Gen AI tools should be just as concerned with the information entered into these tools as with the information they generate. For reference, ChatGPT and similar platforms are built on large language models (LLMs) that use machine-learning algorithms to enable users to have a human-like conversation with the platform. Generally, this means that these tools retain input information to help them generate responses to future prompts. Gen AI models are sometimes referred to as “black boxes” because users cannot determine exactly how the tool generated a response or what information it relied upon to do so. This creates glaring privacy concerns, as the platform may store the information entered by one user and potentially provide that same information to another.

The privacy policy posted on OpenAI’s website provides that “When you use our Services, we collect Personal Information that is included in the input, file uploads, or feedback that you provide to our Services.” Simply stated, there is no guarantee that information entered into ChatGPT will be treated as private or confidential.

Accordingly, entering confidential information into these tools creates a risk of inadvertent disclosure, waiver of legally protected communications or violation of confidentiality obligations. An attorney entering client information to help generate legal arguments most likely waives attorney-client privilege over certain communications; a school administrator entering student records could violate Family Educational Rights and Privacy Act (FERPA) privacy rules; and casual use by a company employee may lead to the inadvertent disclosure of a trade secret or other confidential or sensitive information.

Indeed, Chief Justice John Roberts in the Supreme Court’s Year-End Report recognized that “any use of AI requires caution and humility. Some legal scholars have raised concerns about whether entering confidential information into an AI tool might compromise later attempts to invoke legal privilege.”

As a general rule, users should avoid entering any privileged or protected information into Gen AI tools. Organizations should adopt policies regarding the risks and usage of Gen AI platforms to avoid any inadvertent disclosures by employees. When using a Gen AI tool, it is important to keep in mind that anything entered by the user may be shared with others.

Bond attorneys regularly assist and advise clients on drafting data privacy and cybersecurity policies. For more information regarding data privacy matters, please contact Jessica Copeland, Jackson Somes or any attorney in Bond’s cybersecurity and data privacy practice.


[1] A follow-up information memo will provide more detail on state and federal court guidelines and/or rules related to the use of Gen AI tools.
 

Countdown to Data Privacy Day 2024

January 18, 2024

By: Cybersecurity and Data Privacy Practice

Data Privacy Day is January 28. First recognized in 2007, Data Privacy Day is an international effort to raise awareness about data privacy and to encourage the protection of personal information online. Every year, Bond counts down to Data Privacy Day with a targeted series of privacy-related articles that span a variety of practice areas and disciplines. This year, Bond will be exploring the topics listed below.

  • Assessing Privacy Risks When Using Artificial Intelligence Tools: Curt Johnson and Jackson Somes will discuss the rapid rise of generative Artificial Intelligence (AI) and the numerous privacy risks associated with the technology. This article will guide readers through common privacy pitfalls while encouraging responsible use of generative AI tools like ChatGPT.
  • The EU AI Act – A Privacy Analysis: Mario Ayoub will discuss the recently enacted EU AI Act and the key privacy principles that inform the Act’s requirements. This article will also consider how the Act could impact future U.S. AI regulation from a privacy perspective.
  • The New Jersey Data Privacy Act: Amber Lawyer, Shannon Knapp and Victoria Okraszewski will preview the recently enacted New Jersey Data Privacy Act. This article will discuss the Act’s main requirements and explain how the Act differs from existing state privacy laws.
  • FTC Proposes Strengthening Children’s Privacy Rule: Jessica Copeland, Mario Ayoub and Victoria Okraszewski will explore a recent FTC Notice of Proposed Rulemaking to the Children’s Online Privacy Protection Act. This article will walk through the main proposed changes and explain how they may impact businesses that process children’s data.
  • India’s Digital Personal Data Protection Act: Shannon Knapp will provide an important update on a significant newcomer to the international privacy compliance space. This article will summarize the key requirements introduced by India’s new privacy act and explain how it could affect U.S. businesses.
  • Cyber Insurance in Healthcare: Gabriel Oberfield and Victoria Okraszewski will report on the current state of the cyber insurance market in health care. This article will explain how the cyber insurance market is evolving in response to the rapidly changing security landscape and what this means for the healthcare industry.
  • What's On the Horizon – 2024 State and Federal Data Privacy Legislation: Amber Lawyer and Mario Ayoub will preview the upcoming year in the data privacy world and identify trends across new legislation. 

If you'd like more information or if you have any questions, contact any attorney in Bond's cybersecurity and data privacy practice.