AB 331 — Automated Decision Tools
AB 331 addresses the use of automated decision tools by requiring transparency and accountability in their deployment. The bill aims to mitigate the risk of algorithmic discrimination by ensuring that individuals affected by automated decisions are informed and have avenues for recourse. It applies to developers and deployers of automated decision tools used in consequential decisions affecting individuals. The legislation is intended to promote fairness and prevent discriminatory outcomes in automated systems.
Related Incidents
Incidents in the same harm domain; actors and locations may differ
Orlando man wrongfully arrested after facial recognition misidentification by Orlando police
50-year-old woman wrongfully arrested after facial recognition misidentification in Tennessee, held in custody for six months without compensation
26-year-old South Asian software engineer wrongfully arrested after facial recognition misidentification in Milton Keynes theft case
Innocent South Asian man wrongfully arrested after AI facial recognition misidentification in Milton Keynes, leading to legal action and criticism of racial bias
Adult patron wrongfully arrested after facial recognition misidentification at Peppermill Casino
Related Legislation
Other policies addressing the same harm domain