What is the name of the AI tool? "Lavender"
This week, Israeli journalist and filmmaker Yuval Abraham published a long exposé on the existence of the Lavender program and its use in Israel's operations in Gaza after Hamas's deadly terrorist attack in southern Israel on October 7. Abraham's report, published in the left-wing Israeli English-language magazine +972 and its sister Hebrew publication Local Call, was based on the testimony of six anonymous Israeli intelligence officers, all of whom served during the war and were "directly involved" in the use of AI to select targets for elimination. According to Abraham, Lavender marked as many as 37,000 Palestinians, and their homes, for assassination. (The IDF denied to reporters that such a "kill list" existed, characterizing the program as simply a database for cross-referencing intelligence sources.) White House national security spokesman John Kirby told CNN on Thursday that the United States was looking into the media reports on the AI tool.
"In the early stages of the war, the military gave sweeping approval for officers to adopt Lavender's kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based," Abraham wrote.
"One source said that human personnel often served only as a 'rubber stamp' for the machine's decisions, and that they would typically spend only about '20 seconds' personally on each target before authorizing a bombing, just long enough to confirm that the Lavender-marked target is a man," he added. "This was despite knowing that the system makes what it deems 'errors' in about 10 percent of cases, occasionally marking individuals with only loose ties, or no ties at all, to extremist groups."
This may help explain the scale of destruction and casualties that Israel has unleashed across Gaza in its effort to punish Hamas. In earlier stages of conflict between Israel and Hamas, the IDF relied on a longer, human-driven process of selecting targets based on intelligence and other data. At a time of deep anger and trauma in Israel following the October 7 Hamas attack, Lavender may have helped Israeli commanders plan swift and far-reaching retaliation.
"We were always under pressure to 'bring us more targets.' They really yelled at us," said one operative, in testimony published by the UK's Guardian newspaper, which obtained access to the accounts first surfaced by +972.
Many of the weapons Israel dropped on targets allegedly chosen by Lavender were "dumb" bombs, heavy, unguided munitions that caused significant damage and loss of civilian life. According to Abraham's report, Israeli authorities did not want to "waste" more expensive precision-guided munitions on the many lower-level Hamas "operatives" identified by the program. They also showed little aversion to dropping bombs on buildings where targets' families were sleeping, he wrote.
"We had no interest in killing [Hamas] operatives only when they were in a military building or engaged in military activity," one agent told +972 and Local Call. "On the contrary, the IDF bombed their homes without hesitation, as a first option. It is much easier to bomb a family's home. The system is built to look for them in these situations."
Throughout the war, widespread concern has been expressed about Israel's targeting strategy and methods. "Under the best of circumstances, it is difficult to distinguish between legitimate military targets and civilians," Brian Castner, Amnesty International's senior crisis adviser and weapons investigator, said in December. "And so under basic rules of distinction, Israeli forces should use the most precise weapons available and the smallest weapons appropriate for the target."
Following the Lavender revelations, the Israel Defense Forces issued a statement calling some of Abraham's reporting "baseless" and objecting to his characterization of the AI program. It is "not a system, but simply a database whose purpose is to cross-reference intelligence sources in order to produce up-to-date layers of information on the military operatives of terrorist organizations," the IDF said in a written statement to the Guardian.
“The IDF does not use artificial intelligence systems to identify terrorist operatives or attempt to predict whether a person is a terrorist,” it added. “Information systems are simply tools for analysts in the target identification process.”
This week's Israeli drone strike on a convoy of the prominent food aid organization World Central Kitchen, which killed seven of its employees, has put a spotlight on Israel's conduct of the war. In a phone call with Israeli Prime Minister Benjamin Netanyahu on Thursday, President Biden reportedly urged Israel to take substantive measures to better protect civilian lives and allow the flow of aid.
Separately, hundreds of prominent British lawyers and judges have written to their government calling for an end to arms sales to Israel to avoid "complicity in serious violations of international law."
The use of AI technology is just one part of what worries human rights advocates about Israel's actions in Gaza. But it points to a bleak future. Lavender, observed Adil Haque, an international law expert at Rutgers University, is "every international humanitarian lawyer's nightmare come true."