H 792 — An Act Relating To Liability Standards For Developers And Deployers Of Artificial Intelligence Systems
This bill establishes liability standards for developers and deployers of artificial intelligence systems. It addresses potential harms arising from the use of AI, with a particular focus on algorithmic discrimination and accountability, and is intended to ensure responsible deployment of AI systems and protect individuals from discriminatory outcomes. The bill was first read and referred to the Committee on Commerce and Economic Development on January 28, 2026.
Linked Incidents
8 incidents this policy has been directly linked to
13-year-old student exposed to historically inaccurate AI-generated images via Google Gemini, leading to system-wide pause for retooling
Māori woman wrongfully accused of shoplifting after facial recognition misidentification at New World supermarket leading to racial discrimination complaint
8-month-pregnant woman wrongfully arrested after AI facial recognition error in front of children
Woman wrongfully arrested after AI facial recognition misidentification in US state she never visited
Black man wrongfully arrested after facial recognition misidentification leading to $8M settlement
Security screening system users misinformed about AI accuracy by Evolv Technologies per FTC allegations
Middle-aged man wrongfully arrested after AI misidentification in criminal investigation
29-year-old Black man wrongfully arrested after facial recognition misidentification in Georgia leading to federal lawsuit
Related Incidents
Same harm domain; actors and location may differ
Orlando man wrongfully arrested after facial recognition misidentification by Orlando police
50-year-old woman wrongfully arrested after facial recognition misidentification in Tennessee leading to six-month custody without compensation
26-year-old South Asian software engineer wrongfully arrested after facial recognition misidentification in Milton Keynes theft case
Innocent South Asian man wrongfully arrested after AI facial recognition misidentification in Milton Keynes, leading to legal action and criticism of racial bias
Adult patron wrongfully arrested after facial recognition misidentification at Peppermill Casino