
UN expert flags possible illegality of attacks after report on AI use in Gaza war

Ben Saul, UN special rapporteur on human rights and counterterrorism, said if details in a report on AI-assisted targeting in Gaza prove to be true, “many Israeli strikes in Gaza would constitute the war crimes of launching disproportionate attacks”.

Saul was responding to a report in +972 Magazine and Hebrew-language media outlet Local Call, which revealed that the Israeli army has identified tens of thousands of Gaza Palestinians as potential targets using an AI-assisted targeting system called “Lavender”.


Israel’s AI tactics, resulting in high civilian casualties, being exported

The report by two Israeli publications, +972 Magazine and the Hebrew-language outlet Local Call, on Israel’s use of artificial intelligence in Gaza raises questions about other nations wanting the technology, says an author on the subject.

Antony Loewenstein – an Australian journalist and author of The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World – says Israel is planning to export the AI technology and that other countries are eager to acquire it.

“Israel is currently trying to sell these tools to foreign entities, to governments, that are looking to what Israel’s doing in Gaza, not with disgust, but actually with admiration – and we’ll find out in the coming months and years who they may be,” said Loewenstein.

“My sense is it’s gonna be countries that are currently saying they’re opposed to what Israel is doing.”


Five to 10 ‘acceptable civilian deaths’ for every fighter targeted by Israelis using AI

Two Israeli media organisations are reporting that the Israeli military has been using an AI-powered database called Lavender to isolate and identify bombing targets for air strikes in Gaza.

That database is responsible for drawing up kill lists of as many as 37,000 targets.

The unnamed Israeli intelligence officials who spoke to these publications say Lavender had an error rate of about 10 percent. That did not stop the military from using it to fast-track the identification of often low-level Hamas operatives in Gaza and to bomb them.

According to the publications, this has contributed to thousands of civilian deaths inside Gaza. The human officers interacting with the AI database often served as little more than a rubber stamp, scrutinising the kill list for perhaps 20 seconds before deciding whether to give the go-ahead for an air strike.

Add to that the five to 10 civilian deaths deemed acceptable for every one Palestinian fighter who was the intended target, and it becomes clear why so many civilians have been killed in Gaza.
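The scale those figures imply can be made explicit with a rough back-of-envelope calculation. The sketch below (Python, purely illustrative) combines the three numbers cited in the reporting: 37,000 listed targets, a roughly 10 percent error rate, and five to 10 permitted civilian deaths per target. It assumes, hypothetically, that every listed target were struck with the maximum permitted collateral damage – something the reporting does not claim – so the output shows only the ceiling the policy would allow, not actual casualty data.

# Illustrative arithmetic using figures from the +972/Local Call report.
# These are assumptions for scale only, not verified casualty data.
TARGETS = 37_000          # reported size of the Lavender kill list
ERROR_RATE = 0.10         # misidentification rate cited by officials
CIV_PER_TARGET = (5, 10)  # "acceptable" civilian deaths per fighter

misidentified = TARGETS * ERROR_RATE
low, high = (TARGETS * r for r in CIV_PER_TARGET)

print(f"Potentially misidentified people on the list: ~{misidentified:,.0f}")
print(f"Civilian deaths the policy would permit if every target "
      f"were struck: {low:,.0f} to {high:,.0f}")

Under those hypothetical assumptions, the list alone would contain some 3,700 misidentified people, and the permitted collateral ratio would allow between 185,000 and 370,000 civilian deaths – a ceiling, not an estimate, but one that illustrates why the reported ratio alone predicts a high civilian toll.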


'Low-level Hamas operatives' include the civilian police force protecting aid convoys; hence the shelling of aid convoys. By this definition, any government employee is a target.