Countdown to Data Privacy Day 2023

January 18 - January 28, 2023

What's On the Horizon: 2023 State and Federal Data Privacy Legislation

By: Amber L. Lawyer and Maureen H. Milmoe

The United States is gearing up for another noteworthy year in data privacy and cybersecurity. 2023 will likely be a year of transition as certain data privacy laws come into effect. As concerns around data protection and cybersecurity persist, we expect more state and federal legislative action. Along with enhanced legislation, data privacy enforcement and regulatory action will likely increase in the U.S. throughout the year.

1. New Data Privacy Laws

Five states now have comprehensive consumer privacy laws: California (CCPA and CPRA), Colorado (CPA), Connecticut (CTDPA), Utah (UCPA) and Virginia (VCDPA). As discussed below, more than 30 other states have also considered data privacy legislation, and a few will likely pass in the next year.

California
As of Jan. 1, 2023, consumers have new rights afforded by the California Privacy Rights Act (CPRA), a recent amendment to the California Consumer Privacy Act of 2018 (CCPA). These include the right to correct inaccurate personal information that a business holds about them and the right to limit the use and disclosure of sensitive personal information collected about them. On July 1, 2023, the California Privacy Protection Agency will begin enforcing the CCPA through administrative enforcement actions. For more information on the CCPA and CPRA, you can read our blog post here.

Colorado
Taking effect on July 1, 2023, the Colorado Privacy Act (CPA) protects personal data held by entities that do business in Colorado or target Colorado residents. Notably, covered entities include both for-profit and nonprofit organizations that meet the CPA's applicability thresholds. The Colorado Attorney General has released two drafts of the proposed CPA rules and has scheduled a formal rulemaking hearing for later this year. We expect the formal CPA rules to be finalized within the next few months. For more information on the CPA, you can read our blog post here.

Connecticut
Beginning July 1, 2023, the Connecticut Data Privacy Act (CTDPA) will give Connecticut residents certain rights over their personal data and impose new responsibilities and privacy protection standards on covered entities. Notably, the CTDPA requires covered entities to obtain consent before processing the data of Connecticut residents under the age of 18. For more information on the CTDPA, you can read our blog post here.

Utah
Late this year, on Dec. 31, 2023, the Utah Consumer Privacy Act (UCPA) will take effect. Unlike other state laws, the UCPA applies only to companies with annual revenue of at least $25 million, among other criteria. While the scope of the UCPA is narrower than other state privacy laws, future amendments are a possibility. For more information on the UCPA, you can read our blog post here.

Virginia
On Jan. 1, 2023, the Virginia Consumer Data Protection Act (VCDPA) took effect. A comprehensive consumer data privacy law modeled on the CCPA, the VCDPA protects personal data held by entities that do business in Virginia or target Virginia residents. For more information on the VCDPA, you can read our blog post here.

2. Proposed Data Privacy Laws

Ten states have already proposed new comprehensive consumer privacy laws: Iowa, Indiana, Kentucky, Massachusetts, Mississippi, New Jersey, New York, Oklahoma, Oregon and Tennessee. While it remains to be seen whether these bills will be enacted, they could create an array of new consumer privacy rights and business obligations.

Similar to the five new data privacy laws, the proposed legislation will require many companies to reassess their collection and use of personal information and modify their business practices accordingly. We continue to stay informed on the changing state privacy landscape and will update our guidance to you as new data privacy legislation is enacted.

3. Expected Federal Legislative Action

A patchwork of state privacy laws has prompted federal legislators to propose the American Data Privacy and Protection Act (ADPPA), which aims to provide a uniform approach to data privacy. The ADPPA is largely consistent with the framework of various state privacy laws, as well as the European Union's General Data Protection Regulation (GDPR). Shared features include numerous individual privacy rights, such as the rights to access, delete and correct data, as well as the right to data portability.

Although the ADPPA has yet to pass, the bill has pushed the U.S. closer to enacting a federal data privacy law. Given federal legislators' increased interest in data privacy, regardless of whether the ADPPA itself passes, there is a strong likelihood of a comprehensive federal privacy bill passing soon.

For more information or guidance concerning any of the topics above, please contact Amber Lawyer, CIPP/US & CIPP/E, Maureen Milmoe or any Bond attorneys in the cybersecurity and data privacy practice.
 

What’s on the Global Horizon for Data Privacy in 2023?

By: Amber L. Lawyer, Shannon A. Knapp and Jackson K. Somes

Expect another year of regulatory ambiguity for international data privacy laws in 2023: the European Commission is reviewing the EU-US Data Privacy Framework, EU regulators are signaling increased scrutiny of behavioral advertising and a host of new privacy laws are expected across the globe.

The EU-US Data Privacy Framework

On Oct. 7, 2022, President Biden signed an executive order implementing the EU-US Data Privacy Framework (EU-US DPF). The EU-US DPF provides a mechanism for the transfer of data across EU and US borders. As many readers know, the Court of Justice of the European Union (CJEU) previously invalidated the prior data transfer scheme, the EU-US Privacy Shield, because it did not provide an adequate level of privacy protection as required by GDPR.

The EU-US DPF still must be reviewed by the European Commission before it is implemented. The Biden Administration announced that the updated provisions in the new data transfer framework fully address the concerns raised by the CJEU when it invalidated the Privacy Shield, but the European Commission will determine whether the EU-US DPF meets adequate data privacy standards. Once the European Commission issues an adequacy determination, the EU-US DPF Principles will become immediately effective. Although a determination by the European Commission is expected in 2023, legal challenges to the new framework are expected to follow shortly.

Validity of Contractual Necessity as a Legal Basis for Behavioral Ads Under GDPR in Question

A recent decision issued by Ireland’s Data Protection Commission signaled that EU regulators are closely examining lawful bases for processing user information in connection with behavioral advertising.

At the start of the new year, Ireland’s Data Protection Commission (DPC) fined Facebook’s parent company Meta €390M. In its final decision, the DPC held that Meta did not have a lawful basis under GDPR for processing the personal information of its users for targeted behavioral advertising. Meta asserted that it had a proper contractual basis to process the user data for personalized advertising because this processing was disclosed in the Terms of Service of Facebook and Instagram. However, the DPC declared that the processing of personal data for the purpose of behavioral advertising is not a necessary core element of Meta’s services. Instead, the DPC determined that Facebook’s and Instagram’s main purpose is communication between users.

Although the decision is not an outright ban on behavioral advertising, it does reveal that EU regulators will closely investigate a company’s claimed lawful basis for processing personal data. This ruling could have an important impact on companies with behavioral advertising at the center of their business models.

Going forward, companies across industries should also review their bases for processing personal information, especially processing grounded in contractual necessity, to ensure continued compliance with GDPR.

New International Data Privacy Laws

Several new data privacy laws are expected around the world in 2023. India’s first comprehensive data protection bill is expected to pass in the summer of 2023, Canada is expected to overhaul its national privacy laws with the Digital Charter Implementation Act and the EU Digital Markets Act will take effect in May 2023.

Canada

Canada’s Digital Charter Implementation Act, Bill C-27, was introduced in 2022 and is expected to become law in 2023 without many substantive changes. The act seeks to amend an existing data privacy law as well as enact the Consumer Privacy Protection Act (CPPA), the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act.

Under the CPPA, organizations will generally need a user’s consent to collect the user’s personal information. However, there are a number of exceptions to the consent requirement, including an exception for “legitimate interests” in conducting business activities. Other key provisions of the CPPA include a requirement to keep personal information anonymized, the right for an individual to request that an organization dispose of their personal information and an individual right to portability of personal information.

The CPPA also creates an enforcement regime for issuing compliance orders and imposing penalties for violations of Canada’s privacy laws. The scheme grants new powers to Canada’s Office of the Privacy Commissioner, including the ability to assess an organization’s privacy program and issue recommended corrective measures.

India

India’s long-awaited Digital Personal Data Protection Bill is also expected to be enacted in 2023. The proposed legislation is expected to provide a comprehensive legal framework for data privacy in India, outlining the rights of individuals and the duties of those processing personal data.

A version of the Digital Personal Data Protection Bill has been in the works in India since 2018. The current iteration borrows from other existing data privacy frameworks. For example, any organization that processes personal information, called a data fiduciary, will have to provide notice of the data collected and the purpose for its collection, and obtain consent for the specific purposes for which the personal information is used. The proposed law will also establish affirmative individual rights, such as the right for a person to obtain their collected data and the right to correct any inaccurate or misleading personal data.

EU Digital Markets Act

The EU Digital Markets Act imposes a broad range of prohibitions and obligations on entities determined to be “gatekeepers.” Many of these prohibitions and obligations involve the use of a user’s personal data, in addition to addressing anti-competitive practices. A company is presumed to be a gatekeeper if it meets a three-part definition: (1) the company provides a core platform service that serves as an important gateway for business users to reach end users; (2) the company has a significant impact on the internal EU market; and (3) the company enjoys, or is expected to enjoy, an entrenched and durable position.

In May 2023, the European Commission (EC) will begin the process of designating which companies qualify as gatekeepers. A company presumed to be a gatekeeper by the EC will have the opportunity to rebut the presumption by substantively showing that it does not meet the criteria. The act is expected to primarily target Big Tech companies.

The need for data privacy continues to be recognized across the globe, and the progression toward greater privacy and data-related laws is only gaining speed. Check back tomorrow for our update on U.S. privacy law.

For more information or guidance concerning any of the topics above, please contact Amber Lawyer, CIPP/US & CIPP/E, Shannon Knapp, CIPP/US, Jackson Somes or any attorney in Bond’s cybersecurity and data privacy practice.


Federal Guidance Hints at Robust Disclosure Requirements for Use of Artificial Intelligence

By: Fred J. M. Price and Mario F. Ayoub

Once a technology reserved for science fiction and fantasy, artificial intelligence (AI) now permeates almost every industry. In its most basic form, AI harnesses computer processing power, proprietary algorithms and large datasets to perceive and synthesize data in order to make decisions. The technology’s applications are endless. AI is leveraged in facial recognition software, autonomous vehicles, credit card fraud detection and even video game development. Other applications may be less obvious but have more serious implications. AI-assisted automated decision-making software is used to make hiring decisions, approve loans, extend credit, administer social services, profile individuals in law enforcement contexts, predict recidivism rates and profile consumers.

Despite AI’s influence in many sensitive areas such as housing, law enforcement, education and social services, many of the algorithms used to inform automated decision-making systems are proprietary. This means that both the public and regulators have a very limited understanding of how these systems work and what data is used to inform decisions. While protection for a developer’s proprietary algorithms spurs innovation and generates profit, the “black box” nature of these systems presents a significant challenge to regulators faced with combating algorithmic bias, discrimination and error.

Current Patchwork of State Laws

California, Colorado (effective July 2023), Virginia and Connecticut (effective July 2023) have all passed legislation governing the use of AI. Only one state requires disclosures regarding how AI is used to make decisions. Specifically, California’s CPRA charges the new California Privacy Protection Agency with adopting regulations “governing access and opt-out rights with respect to businesses’ use of automated decision-making technology,” including public disclosure about the logic involved.

The remaining states provide opt-out rights, though regulations may add disclosure rights in the future. Colorado and Virginia will allow consumers to opt out of “profiling,” which is defined as “any form of automated processing performed on personal data” where such profiling is in furtherance of decision-making pertaining to housing, financial services, healthcare, criminal justice, employment and other sensitive areas. Similar to the GDPR’s approach, Connecticut differs slightly by granting these opt-out rights only when decisions are made solely using automated processing.

Early Stages of U.S. Federal Regulation

In late 2022, the federal government turned its attention toward AI’s role in making decisions, especially in more sensitive contexts where disparate impacts may arise. On Aug. 22, 2022, the FTC published an Advance Notice of Proposed Rulemaking on Commercial Surveillance and Data Security (the “ANPR”) “to request public comment on the prevalence of commercial surveillance and data security practices that harm consumers.” In October 2022, the White House released a Blueprint for an AI Bill of Rights (the “Blueprint”) “to help provide guidance whenever automated systems can meaningfully impact the public’s rights, opportunities or access to critical needs.”

Like California’s CPRA, both the ANPR and the Blueprint focus on the importance of transparency and disclosure. Citing the increased adoption of AI in decision-making models, the ANPR warns of new mechanisms for discrimination. The FTC asks whether new rules should require companies to disclose (1) data usage; (2) collection, retention, disclosure and transmission practices; (3) the use of automated decision-making systems to analyze or process data; (4) how they use data to make decisions; and (5) their reliance on third-party decision-making tools, among other disclosures.

The Blueprint aligns with the ANPR’s focus on transparency. One of the Blueprint’s five core principles is “Notice and Explanation,” which states that individuals should know when an automated system is being used to make decisions, and how and why it contributes to outcomes that impact them. Both developers and deployers of AI systems should disclose “generally accessible plain language documentation including clear descriptions of the overall system functioning and the role automation plays, notice that such systems are in use, the individual or organization responsible for the system, and explanations of outcomes that are clear, timely and accessible.”

While notice, explanation and other disclosure requirements may offer some protection to consumers subject to automated decision-making, developers may be concerned that these requirements will put their proprietary information in jeopardy. While some large companies, such as Amazon, Google and Meta, are beginning to embrace an open-source approach to AI development, many smaller companies still rely on the proprietary nature of their systems to generate profit and distinguish themselves in the market. Disclosure requirements, however, do not necessarily spell the end for this business model, provided that companies plan carefully for how they will convey information to the public and regulators.

Preparing for Compliance

While a comprehensive AI regulatory framework is still likely a long way off, organizations that develop automated decision-making software should begin crafting statements now that (1) satisfy potential disclosure requirements suggested by both the ANPR and the Blueprint and (2) offer sufficient protection for proprietary algorithms and related intellectual property. Additionally, companies that rely on service providers to process data using automated means will also likely have disclosure obligations under a federal framework. These companies should reach out to their service providers now to request documentation regarding how data is processed and how AI impacts decision-making. Organizations that process data using AI technology may wish to consider the following compliance tips when crafting notice, use and other similar disclosures:

  • Use plain language and avoid technical terms when describing automated systems. The disclosures must be easy to understand without a technical background.
  • Any disclosure language should identify the entities responsible for designing, maintaining and using each component of the system.
  • Notice language should be available prior to processing and updated regularly to account for any changes to the system.
  • Keep explanations brief. To date, available guidance does not require a granular explanation that reveals proprietary information. While this may change, any publicly available explanations should be drafted to protect sensitive and proprietary information. Focus on topics such as the quality of the datasets, the accuracy of the model and the effects of the model on the public. Regulators are less concerned with how a model works and more concerned with whom it impacts.
  • Avoid charging the development team with drafting disclosure language. The development team should work closely with legal or compliance professionals to make sure any publicly available descriptions are easy to understand and devoid of sensitive or proprietary information.

Bond attorneys regularly assist and advise clients on an array of data privacy and intellectual property matters. If you have any questions about artificial intelligence, FTC compliance or IP-related issues, please contact Fred Price, Mario Ayoub or any attorney in Bond's intellectual property or cybersecurity and data privacy practices.


Fortnite Skinned: Fined $520 Million by the FTC for Privacy Violations

By: Jessica L. Copeland and Mario F. Ayoub

On Dec. 19, 2022, Epic Games, the developer of the popular video game Fortnite, agreed to pay more than $520 million to settle Federal Trade Commission (FTC) claims alleging violations of the Children’s Online Privacy Protection Act (COPPA) and the deployment of “dark patterns” used to trick players into making in-game purchases. The settlement amount is divided into two parts: Epic will pay a $275 million penalty to the FTC for privacy violations and $245 million in refunds to users tricked into making unwanted purchases.

Released in July 2017, the colorful survival shooter quickly became a household name and currently boasts 400 million active players. Children make up a substantial portion of the player base. A 2019 survey revealed that a staggering 53% of U.S. children aged 10-12 played Fortnite weekly, compared to 33% of teens aged 13-17 and 19% of young adults aged 18-24.

The game is as profitable as it is popular. Despite purporting to be “free to play,” Fortnite reported more than $9 billion in revenue in the game’s first two years alone. Fortnite earns revenue through in-game microtransactions for cosmetic items such as character costumes. Players cannot purchase anything that would give them a competitive advantage or increase their odds of winning.

Privacy Violations

Under COPPA, it is “unlawful for any operator of a website or online service directed to children, or any operator that has actual knowledge that it is collecting or maintaining personal information from a child, to collect personal information.” In a federal court complaint, the FTC alleged that Epic violated COPPA by collecting personal information from players under the age of 13 without obtaining verifiable parental consent.

The FTC determined that there was overwhelming evidence that Fortnite targeted children. Through market research, Epic understood that a majority of its player base was under the age of 13. The developer executed various marketing and licensing deals for children’s toys and Halloween costumes and hosted live events featuring celebrities popular with children. The FTC also highlighted internal communications clearly revealing Epic’s focus: “Agree with the idea that, generally, all theming should be relevant to [an 8 to 10 year old] as a litmus test.”

Despite Epic’s clear targeting of young consumers, the developer failed to establish a mechanism for obtaining parental consent for players under 13. While the FTC did acknowledge that Epic eventually started requiring parental consent, the developer still did not obtain consent for accounts created on Xbox and PlayStation consoles. Moreover, Epic made limited efforts to retroactively correct consent issues with existing accounts.

In addition, the FTC alleged that Fortnite’s always-on text and voice chat harmed minors. Adult players had unfettered access to chatting with minors, with no way for parents to limit their child’s exposure. The FTC found evidence that children were bullied, threatened, harassed and exposed to sensitive and adult content while playing Fortnite. Shortly after the game launched, Epic employees attempted to bring this issue to the company’s attention, but these complaints were largely ignored. Epic did eventually add privacy controls to turn chat functions off, but the FTC found they were not easily accessible and buried in the settings menu.

Dark Patterns

In a separate administrative complaint, the FTC alleged that Epic tricked players into making unwanted purchases by employing dark patterns. Put simply, dark patterns are interfaces designed to trick users into doing things they do not intend. For example, the FTC alleged that Fortnite’s counterintuitive menu layout and button configuration caused players to make accidental purchases with one click. Players could be charged while attempting to wake the game from sleep mode or while the game was loading. Additionally, and similar to claims brought against Amazon, Apple and Google, Fortnite allowed children to make repeated charges to their parents’ credit cards without parental consent. These practices generated hundreds of millions of dollars in unauthorized charges.

Key Takeaways

Through this settlement, the FTC sent a strong signal that it will be closely monitoring commercial marketing activity targeting children. This is consistent with the FTC’s recent activity in the privacy and commercial surveillance space. On Aug. 11, 2022, the FTC published an Advance Notice of Proposed Rulemaking to “ask the public to weigh in on whether new rules are needed to protect people’s privacy and information in the commercial surveillance economy.” The notice focuses on the need to bolster children’s privacy online and the importance of obtaining informed consent, among other topics. Organizations looking to avoid regulatory attention should consider the following compliance tips:

  • Evaluate your organization’s customer base and determine whether your organization’s website markets toward children. Does the website sell children’s products or contain branding from popular children’s franchises?
  • Listen to your employees. Often, your employees have the most accurate understanding of your brand, website and products. Check in with them often and carefully consider any issues they bring to your attention.
  • Disclaimers are not enough. Claiming that your organization complies with COPPA or that your website is not intended for children will not deter regulatory scrutiny. Here, the FTC conducted a detailed factual inquiry that considered Epic’s practices and actions. A privacy statement that contains the right language is insufficient without actual compliance.
  • Evaluate your website’s interface. How many clicks does it take to make a purchase? Are key privacy controls difficult to locate? Does the website provide notice to consumers regarding the collection of payment information and an opportunity to consent? Does your website ask consumers to reauthorize payment information after each purchase? Are there adequate checks to prevent children from accessing stored credit and debit cards?

Bond attorneys regularly assist and advise clients on an array of data privacy and cybersecurity matters, including compliance with COPPA and other privacy authorities. If you have any questions about COPPA or FTC privacy enforcement, please contact Jessica Copeland, CIPP/US, Mario Ayoub or any attorney in Bond's cybersecurity and data privacy practice.


Data Privacy Laws Zeroing in on Employees’ Rights

By: Gianelle Duby, Amber Lawyer and Shannon Knapp

Data privacy continues to be a primary focus of several recently enacted or amended state laws. Employees’ right to privacy is of particular interest in light of the various ways personnel maintain connectivity to the digital world throughout the workday. The ease with which applications and devices track, collect and process personal information underscores the need for clearly defined rules governing acceptable employment practices that comply with applicable data privacy laws. In recent years, there has been a notable expansion of employee data privacy rights, driven by the proliferation of employment-related privacy laws and increased recognition of employee privacy by companies. This is especially evident in the enactment of state employee monitoring laws, the expiration of the employer exemption under the California Consumer Privacy Act (CCPA) and the implementation of employee-specific privacy policies.

Employee Monitoring

States are trending towards increasing transparency and privacy in the workplace by passing laws that require employers to notify employees if they are monitoring them.

In May 2022, New York became one of the latest states to recognize the importance of employee privacy through the enactment of its employee monitoring law. The law requires private employers with a place of business in New York to provide employees with written notice if the employer monitors or intercepts employee emails, internet access or usage, or telephone conversations. The written notice must communicate that “any and all telephone conversations or transmissions, electronic mail or transmissions, or internet access or usage by an employee by any electronic device or system . . . may be subject to monitoring at any and all times by any lawful means.” For more information on New York’s electronic monitoring law, you can read our prior blog post here.

New York followed the lead of Connecticut and Delaware, both of which have enacted similar employee monitoring laws. Along the same lines, in Texas, employer monitoring of employee electronic communications is considered an invasion of privacy. An employer may monitor its own phone system in order to ensure that employees are using the system for its intended purposes. However, employers must inform employees that this monitoring may be taking place.

Based on these efforts, it is likely that other states will follow the trend and pass legislation that limits workplace monitoring and/or requires notice to employees of monitoring activities.

Employee Data Under the CCPA

Since California enacted the CCPA in 2018, employers have been watching to see how the law will apply to data collected and maintained about their employees. Until now, employment data has been exempt from most of the CCPA’s requirements. However, as a result of amendments to the CCPA contained in the California Privacy Rights Act (CPRA) that went into effect Jan. 1, 2023, many categories of employee data are now subject to the CCPA’s requirements, and employers must comply with certain obligations when processing employee data.

Beginning in 2023, the CCPA broadly applies to employee data. Employee data will now be treated the same as other consumer data, and covered employers will need to add employee and human resources data to their ongoing compliance efforts. In the employment context, personal information could include an employee’s contact information, insurance and benefits elections, bank and direct deposit information, emergency contacts, dependents, resume and employment history, performance evaluations, wage statements, time punch records, stock and equity grants, compensation history and many other forms of data routinely collected throughout the employment relationship. Moreover, the CPRA introduces the concept of “sensitive personal information,” which includes financial information, Social Security numbers, communications content, health information and biometrics, all of which must now be considered and addressed by the employer.

To comply with the CPRA, employers must prepare and provide a privacy notice to employees and job applicants at or before the time personal information is collected, enter into specific data processing agreements with vendors, honor employee rights requests under the CCPA with regard to their personal data and take other compliance steps.

For more information about the amendments contained in the CPRA, you can read our prior blog post here.

Employee Specific Privacy Policies

As privacy continues to be a growing concern among businesses, employees and consumers, it is imperative that employers adopt comprehensive and robust privacy policies that let their employees know how their personal data will be collected, processed, stored and shared. These policies are essential for any company that uses or discloses an employee’s personal data for business purposes. Important employee privacy policies include general employee privacy policies, bring-your-own-device policies, acceptable use policies, employee monitoring policies and biometric collection policies, to name a few. These policies increase transparency between the employer and employees and set privacy expectations for employees.

Most often, employee privacy policies are implemented as a means to comply with applicable workplace privacy regulations and legal requirements, such as those under the CCPA/CPRA and the state employee monitoring laws mentioned above. For example, the Illinois Appellate Court, Second District, recently held that Illinois’ Biometric Information Privacy Act (BIPA) requires private entities to publish written data retention and destruction policies at or before the time they collect biometric data and, in certain circumstances, to obtain employee consent prior to collection.

Effective policies should define what constitutes personal information and the means by which it may be collected. Further, the policy should clearly identify situations in which an employee should not assume that their data or communications are private. For example, an employee does not have a reasonable expectation of privacy in phone calls, texts, emails and social media communications transmitted on company-owned equipment. Similarly, software and websites that are not required for business purposes may be restricted or blocked under the policy. The policy should also specify the conditions under which employee data will be disclosed.

In 2022 and heading into 2023, legislative developments and increased individual awareness of personal data privacy have underscored the importance of employee privacy efforts. Employers must stay abreast of the evolving legal landscape and its growing recognition of employee privacy, and remain up to date on any new obligations that may result.

If you have any questions about the above or employee privacy, please contact Gianelle Duby; Amber Lawyer, CIPP/US & CIPP/E; Shannon Knapp, CIPP/US; or any attorney in Bond's cybersecurity and data privacy practice.


Healthcare and Cybersecurity: CIRCIA’s Potential Effect on Healthcare Entities

By: Gabriel S. Oberfield, Esq., M.S.J.

Welcome to 2023. As in 2022, we are likely to see a continuing escalation of cyber intrusion threats to healthcare entities and their data. Healthcare data breaches are already far from a trivial matter: according to one expert, there were more than 4,400 breaches involving 500 or more records between 2009 and 2021, with disclosures of healthcare records topping 300 million. At Bond, we will be tracking how our federal cybersecurity structure changes and adapts to these increased risks, what that means for healthcare providers and the regulations that apply to them, and how these changes aim to protect healthcare data integrity.

In March 2022, President Biden signed the Cyber Incident Reporting for Critical Infrastructure Act of 2022 (CIRCIA), which requires the Cybersecurity and Infrastructure Security Agency (CISA) to develop and implement regulations requiring covered entities to report covered cyber incidents and ransomware payments to CISA. Covered entities under CIRCIA include some healthcare entities. As part of its rulemaking process, CISA issued a Request for Information last fall to inform its development of regulations that may fundamentally change the regulatory landscape. Review of the responses is underway, and the implications could be vast.

At a high level, CIRCIA ups the ante by requiring companies operating in the healthcare space and other ‘critical infrastructure’ sectors to report cyber incidents within 72 hours and ransomware payments within 24 hours. In addition, because CIRCIA gives CISA the authority to develop the implementing regulations, CISA may include compliance requirements beyond what is currently required of healthcare entities. This important rulemaking will continue throughout 2023, but the reporting requirements will not take effect until CISA’s rulemaking becomes final.

How does CIRCIA mesh with HIPAA and its various reporting requirements? Although CIRCIA seems to allow covered entities to avoid duplicative reporting where a functionally similar reporting requirement already exists (e.g., HIPAA), it may turn out that HIPAA’s existing requirements (e.g., breach notification, as enforced by HHS’s Office for Civil Rights) fall below the bar and that CIRCIA will require more. CISA will have a lot of say on that, and this is the first major rulemaking that this relatively new agency has taken on.

The public comments submitted on CIRCIA by healthcare entities are particularly telling. Organizations spelled out concerns about duplication and unnecessary confusion; a number stressed the importance of cleanly implementing the CIRCIA provision that precludes CISA from requiring duplicative reporting (see CIRCIA at Section 2242(a)(5)(B)). Others emphasized that required reporting should comprise only data absolutely necessary for governmental operations, both to protect data integrity wherever possible and, where necessary, to allow ongoing ‘ransom’ negotiations to continue out of the limelight when that benefits data retrieval efforts.

As CISA develops CIRCIA regulations during 2023, Bond will be watching closely. In the meantime, we encourage readers to avail themselves of useful healthcare cybersecurity resources, including those of the ‘405(d)’ task group (of which this author is a member). For readers in New York State, the New York Healthcare Cyber Alliance (which this author co-chairs) continues its work of linking healthcare delivery organizations to resources that can improve their cyber posture.

For more information regarding healthcare and data privacy, contact Gabriel Oberfield or any attorney in Bond’s cybersecurity and data privacy practice.


Countdown to Data Privacy Day 2023

January 18, 2023

By: Cybersecurity and Data Privacy Practice

World Data Privacy Day is an international event that occurs each year on January 28. This event is aimed at raising awareness and promoting best practices related to privacy and data protection. The date commemorates the Jan. 28, 1981 signing of Convention 108, the first legally binding international treaty dealing with privacy and data protection. The celebration of World Data Privacy Day encourages individuals and businesses to become more aware of the rights and responsibilities associated with data privacy.

Given the growing digital presence in the modern world, privacy and data protection are relevant to all businesses—no matter the size. The ever-changing data privacy legal landscape influences the way we think about, collect, use and safeguard data. Few, if any, areas of the law have changed and developed as rapidly as data privacy over the last few years. Staying up to date on these new and amended laws and regulations is essential to ensure compliance and best practices.

The attorneys in Bond’s cybersecurity and data privacy practice are committed to providing comprehensive and practical advice to our clients while staying up to date on the data privacy landscape. We will be counting down the days to World Data Privacy Day by providing you with relevant information on various data protection matters. This campaign will feature information memoranda, articles, webinars and podcasts dedicated to highlighting relevant data privacy topics. Look out for more information from our group on topics including HIPAA; employee privacy; what’s on the horizon for privacy in the U.S. and internationally; the FTC’s settlement with Epic Games; IP and privacy; and much more. As we count down the days to World Data Privacy Day, there is no better time to assess your organization’s privacy and data protection practices. The attorneys at Bond are equipped with the industry experience to assist you in this process.

For more information regarding the information above or the specific compliance efforts businesses should be taking, contact any attorney in Bond’s cybersecurity and data privacy practice.