Israeli military uses AI-assisted Lavender database to compile Gaza target lists, leading to civilian casualties
Summary
Israeli armed forces are reported to be employing an AI-assisted database called Lavender to compile target lists of as many as 37,000 individuals in the Gaza Strip. The system, described as largely untested and carrying an estimated 10% error rate, is used to accelerate the identification of suspected Hamas operatives, and its use has been linked to civilian casualties. Human-rights experts, UN officials, and scholars have warned that the deployment may constitute war crimes and have urged a moratorium on the use of AI in armed conflict. Gaza's Ministry of Health estimates that more than 33,000 Palestinians have been killed and over 75,000 wounded since the conflict intensified.
Incident Details
Harms arising from AI or automated systems making consequential decisions without adequate human oversight.
Who Was Affected
Sources
This incident is documented by a single source. Source count reflects coverage in our monitored feeds, not the totality of reporting, and we do not evaluate publication quality.