A 8833 — Establishes Understanding Artificial Intelligence Responsibility Act
A 8833 establishes the Understanding Artificial Intelligence Responsibility Act, which aims to create a framework for the responsible development and use of artificial intelligence systems in New York. The bill focuses on ensuring transparency, fairness, and accountability in AI systems in order to prevent algorithmic discrimination and protect privacy. It addresses harms such as biased AI decision-making and unauthorized use of biometric data. The bill was introduced in the 2025-2026 legislative session and referred to the Science and Technology Committee on January 7, 2026.
Linked Incidents
9 incidents this policy has been directly linked to
8-month-pregnant woman wrongfully arrested after AI facial recognition error in front of children
13-year-old student exposed to historically inaccurate AI-generated images via Google Gemini, leading to system-wide pause for retooling
New York City business owners receive illegal advice from MyCity AI chatbot, leading to bot shutdown in early 2026
Woman wrongfully arrested after AI facial recognition misidentification in US state she never visited
Black man wrongfully arrested after facial recognition misidentification leading to $8M settlement
Security screening system users misinformed about AI accuracy by Evolv Technologies per FTC allegations
Two Amazon employees file ADA discrimination complaints over AI accommodation evaluation system and return-to-office policy
Middle-aged man wrongfully arrested after AI misidentification in criminal investigation
Seattle resident banned from Madison Square Garden venue after t-shirt design linked via facial recognition to 2021 incident he did not attend
Related Incidents
Same harm domain; actors and location may differ
Orlando man wrongfully arrested after facial recognition misidentification by Orlando police
50-year-old woman wrongfully arrested after facial recognition misidentification in Tennessee leading to six-month custody without compensation
26-year-old South Asian software engineer wrongfully arrested after facial recognition misidentification in Milton Keynes theft case
Innocent South Asian man wrongfully arrested after AI facial recognition misidentification in Milton Keynes, leading to legal action and criticism of racial bias
Adult patron wrongfully arrested after facial recognition misidentification at Peppermill Casino
Related Legislation
Other policies covering the same harm domain