And Israel is exporting the killer AI...

It's not us committing war crimes, it's the AI!


Israel’s Lavender AI concerning from a legal, moral, humanitarian perspective

Sai Bourothu, a researcher on the Automated Decision Research team at the Stop Killer Robots coalition, has spoken to Al Jazeera about Israel’s use of the Lavender system, an AI-powered database, to identify targets for bombing in Gaza.

She said the increasing use of such processing systems in conflict is “deeply concerning from a legal, moral and humanitarian perspective”.

“While the Habsora [or Gospel] system used AI to identify targets such as buildings and structures, the Lavender system generates human targets,” she said.

The use of such a system by Israel in the Gaza Strip, she said, raises “grave concerns over the increasing use of autonomy in conflict, digital dehumanisation, artificial intelligence, automation bias and human control in the use of force”.

She noted that these systems’ rapid generation of targets and the Israeli military’s sweeping approval of recommended targets “brings into question compliance with international humanitarian law, particularly the principles of distinction and proportionality, and raises serious ethical concerns around responsibility and accountability for the use of force”.

“The reduction of people to data points through the use of these systems has contributed to accelerating digital dehumanisation in war,” she said.
