New York’s New Guardrails on AI-Generated Performers and Postmortem Digital Replicas: What Institutions of Higher Education Need to Know
January 15, 2026
By: Kymberley Walcott-Aggrey and Gabriel S. Oberfield, Esq., M.S.J.
New York has enacted two significant AI-related laws aimed at “protecting consumers and boosting AI transparency” as part of the state’s AI regulatory agenda, at the very moment the federal government is asserting a contrary, preemptive posture. The result is a potentially fast-evolving compliance landscape for institutions of higher education (IHEs) and other entities that incorporate AI into recruitment, development and other promotional or marketing materials created or disseminated in New York, or that use AI-generated digital replicas of deceased individuals.
The New Laws at a Glance
Governor Hochul signed two bills that establish concrete limitations on certain AI uses. The first bill, S. 8420-A/A.8887-B, likely has the broadest potential impact on IHEs and their communications teams, reaching recruitment, alumni, development and other functions that use AI to create content: it requires disclosure when advertisements include “AI-generated synthetic performers,” defined as “digitally created media that appear as a real person.” S. 8420-A/A.8887-B imposes civil penalties of $1,000 for a first violation and $5,000 for each subsequent violation. The law takes effect 180 days after signing, making mid-June 2026 a critical compliance date for IHEs that use or procure AI-enhanced media.
The second bill, S.8391/A.8882, requires consent from heirs or executors before a deceased person’s name, image or likeness may be used via a digital replica. S.8391/A.8882 amends the definitions of “deceased performer,” “deceased personality,” and “digital replica” in relation to the right of publicity, effectively restricting the use of digital replicas. This law is immediately enforceable and requires violators to pay the greater of $2,000 or compensatory damages, plus any profits attributable to the unauthorized use. Courts may also award punitive damages.
Federal Preemption Risk
On Dec. 11, 2025, the President issued an Executive Order (EO) seeking to curb state regulation of AI and to avoid a “patchwork” of divergent state standards. The order directs the U.S. Attorney General to challenge state laws perceived to impede AI development. Notwithstanding questions concerning its enforceability, the EO introduces uncertainty about whether and when federal action might preempt or chill enforcement of state AI statutes.
Practical Implications for IHEs
IHEs routinely produce and procure content across admissions, development, athletics, performing arts, continuing education and online programs – functions where AI tools can generate or manipulate images, voices and likenesses. To avoid penalties under S. 8420-A/A.8887-B, campuses should ensure that any advertisement containing AI-generated content that could meet the definition of a “performer” that “appear[s] as a real person” includes the required disclosure before the mid-June 2026 effective date. While the law’s definition would not reach AI-generated illustrations or stylized images, disclosure of AI use in these contexts may still be covered by a college or university’s own internal policies.
For deceased individuals – such as notable figures from an institution’s history, alumni, donors, faculty, artists or athletes – consent must be obtained before deploying a digital replica in promotional materials, virtual tours or commemorations. Because this law is already enforceable, IHEs should be taking steps now to ensure that institutional content does not include such AI-modified likenesses (or that appropriate consents have been obtained).
Recommended Next Steps for IHEs
These obligations carry operational and contractual implications. Vendor agreements for marketing, creative services and media production should be updated to require disclosure of AI-generated content where applicable and to address the consent requirement for postmortem digital replicas. Internal review processes should identify where synthetic media appears in IHE channels and track disclosure and consent status. IHEs also should monitor federal developments and potential litigation over the EO’s preemptive thrust, recognizing that compliance must align with current state law while remaining adaptable to potential federal shifts.
This piece complements the article titled “New Laws in New York Apply Guardrails to AI While the President Seeks to Federally ‘Legislate’ AI Through an Executive Order” authored by Bond Attorneys Gabriel S. Oberfield and Mark A. Berman and published in the New York Law Journal, available here.
Bond’s Higher Education practice group will monitor developments concerning these and related laws. For more information, please contact Kymberley Walcott-Aggrey, Gabriel S. Oberfield or any attorney in Bond’s Higher Education practice group with whom you work regularly.
