- Israel's AI system "Where's Daddy?" tracks suspected Hamas militants to their homes, anonymous Israeli intelligence officers told +972 Magazine and Local Call.
- The IDF said it makes "various efforts to reduce casualties to civilians as much as possible."
As civilian casualties continue to rise in the war-torn Gaza Strip, reports that Israel is using artificial intelligence (AI) to target Hamas militants have come under increasing scrutiny.
Israeli news outlets +972 Magazine and Local Call reported earlier this month that the Israeli military has relied heavily on two AI tools: "Lavender" and "Where's Daddy?"
According to the report, "Lavender" identifies suspected Hamas and Palestinian Islamic Jihad (PIJ) militants and their homes, while "Where's Daddy?" tracks those targets and notifies the military when they return home. The report cites six Israeli intelligence officers who said they used AI systems, including "Where's Daddy?", in operations in Gaza.
"We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in military activity," one intelligence officer told +972 Magazine and Local Call.
"On the contrary, the IDF bombed them in their homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations," they added.
Another officer told the publication that the approach often resulted in civilian deaths, describing them as "collateral damage."
The Israel Defense Forces (IDF) said in a statement: "The IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes."
Misidentifications
The "Lavender" system is also said to sometimes flag targets with weak or no ties to militant groups, making "errors" in roughly 10% of cases, the sources said.
These misidentifications included people who shared a name or nickname with a militant, or who were using a device previously owned by one, the sources added.
Brianna Rosen, a senior fellow at Just Security and a strategy and policy fellow at Oxford University's Blavatnik School of Government, estimates the margin of error is even higher.
"Israel's permissive targeting standards and faulty AI outputs are likely further exacerbating the risk to civilians," Rosen said, adding that the risk will only grow "as the war accelerates."
"This suggests that target verification and other precautionary obligations required by international law may be much more difficult to meet, and more civilians may be misidentified and mistakenly killed," she continued.
Officers also told +972 Magazine and Local Call that human oversight of the target-selection process was minimal, amounting to "rubber stamping" the machine's picks after roughly 20 seconds of review, often only to confirm that the target was male.
"Automated, imprecise, and biased targeting is actually more like indiscriminate targeting," Heidy Khlaaf, who previously worked as engineering director of machine learning assurance at cybersecurity firm Trail of Bits, told Politico.
"Dumb" bombs
Sources said the Israel Defense Forces used unguided "dumb" bombs against alleged junior Hamas operatives in order to conserve more expensive guided munitions.
CNN reported in December, citing an assessment from the Office of the Director of National Intelligence, that nearly half of the Israeli munitions used in Gaza were "dumb" bombs.
President Joe Biden warned at the time that Israel could lose international support over its "indiscriminate bombing" of the Gaza Strip, even as the US continues to send it arms.
Civilian death toll
The use of such bombs, combined with Israel's apparently permissive targeting methods, has contributed to a high civilian death toll in the Gaza Strip.
Some have questioned whether Israel is adhering to international law and the principle of proportionality, which prohibits attacks "which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated."
Brianna Rosen told Business Insider that Israel holds a "very permissive interpretation" of international law and of the precautionary measures it requires.
"In practice, the principle of proportionality did not exist," one intelligence officer told +972 Magazine and Local Call.
"Anyone who fought in Hamas in the past year or two could be bombed with 20 [civilians killed as] collateral damage, even without special permission," they added.
IDF response
In a previous statement to Business Insider, IDF representatives said the report was “misleading and contains numerous false claims.”
"Analysts must conduct independent examinations and verify that identified targets meet the relevant definitions in accordance with international law and the additional restrictions set out in IDF directives," they added.
The IDF also issued a statement online following the investigation, saying: "Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist."
“The Israel Defense Forces makes various efforts to reduce the harm to civilians as much as possible under the operational circumstances during an attack,” the statement said.
Brianna Rosen said the IDF's statement "doesn't say much at all."
While the reporting from +972 Magazine and Local Call sheds some light on how Israel is deploying AI in its operations, much remains unknown.
Regarding proportionality and Israel's use of the technology, Sarah Yager, Washington director at Human Rights Watch, told Politico: "We have no idea. It's like a black box."
Since the October 7 Hamas attack on Israel, which killed around 1,200 people and saw about 240 taken hostage, more than 32,000 Palestinians have been killed in Gaza, according to the Gaza Health Ministry.