Also relevant here:
Israel’s use of Lavender system is ‘an AI-assisted genocide’: Expert
In a recent report, the Israeli-Palestinian publication +972 Magazine and the Hebrew-language media outlet Local Call revealed that the Israeli army used an AI targeting system called “Lavender” to identify tens of thousands of Palestinians in Gaza as potential targets.
Marc Owen Jones, an assistant professor in Middle East Studies and digital humanities at Hamad Bin Khalifa University, spoke to Al Jazeera about the report:
“It is becoming increasingly clear that Israel is deploying untested AI systems that have not gone through transparent evaluation to help make decisions about the life and death of civilians,” he said in an interview.
“The fact that the operators can tweak the algorithms based on pressure from senior officers to find more targets suggests they are actually devolving accountability and selection to AI and using a computer system to avoid moral accountability.”
“The operators themselves have pointed to how the AI is simply an efficient killing machine,” he said, “and it is explicitly not used to reduce civilian casualties but to find more targets.”
“This helps explain how over 32,000 people have been killed. Let’s be clear: This is an AI-assisted genocide, and going forward, there needs to be a call for a moratorium on the use of AI in the war.”
He added, “It’s unlikely, without pressure from Israel’s allies, that there will be an end to [AI’s] use.”