New Delhi:
A report has surfaced claiming that the Israeli military used advanced artificial intelligence (AI) systems in the Gaza bombing campaign. These systems, named Lavender and Gospel, reportedly play a central role in the IDF's targeting strategy and have sparked debate about the ethical and legal implications of their deployment.
What is Lavender AI?
Lavender was reportedly developed by Israel's elite intelligence division, Unit 8200, and serves as an AI-powered database designed to identify potential targets associated with Hamas and Palestinian Islamic Jihad (PIJ). Lavender uses machine learning algorithms to process vast amounts of data and flag individuals considered "junior" operatives within these armed groups.
Lavender initially flagged as many as 37,000 Palestinian men with suspected ties to Hamas and PIJ, according to reports by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call. This use of AI to identify targets represents a major shift for Israel's intelligence agencies, such as Mossad and Shin Bet, which have historically relied on more labor-intensive human decision-making.
According to the report, soldiers often spent as little as 20 seconds reviewing each target Lavender flagged before deciding whether to bomb it, typically only to confirm the target's gender. This happened often: human soldiers frequently accepted the machine's output without question, even though the AI program had a margin of error of up to 10 percent, meaning it could be wrong as often as one time in ten. The report said the program often targeted individuals with little or no ties to Hamas.
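To put those two reported figures together, here is a back-of-the-envelope calculation (an illustration of scale, not a number from the report itself): a 10 percent error margin applied across 37,000 flagged individuals implies thousands of potential misidentifications.

```python
# Rough illustration only: combines the two figures attributed to the
# +972 Magazine / Local Call reporting. Not an official statistic.

flagged_targets = 37_000   # individuals reportedly flagged by Lavender
error_rate = 0.10          # "margin of error of up to 10 percent"

potential_misidentifications = int(flagged_targets * error_rate)
print(potential_misidentifications)  # → 3700
```

At that scale, even a seemingly small error rate translates into thousands of people potentially marked in error, which is why the 20-second human review window drew criticism.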
What is Gospel AI?
Gospel is another AI system, one that automatically generates targets from machine recommendations. Unlike Lavender, which identifies people, Gospel reportedly identifies structures and buildings as targets.
“It is a system that allows targets to be created at high speed using automatic tools and works by improving accurate and high-quality intelligence material according to the requirements. With the help of artificial intelligence, by rapidly and automatically extracting data to generate cutting-edge intelligence-recommendations for researchers, the goal of which is a perfect match between machine recommendations and human-performed identifications,” the IDF said in a statement.
The specific data sources fed into Gospel remain undisclosed. However, experts suggest that AI-driven targeting systems typically analyze diverse datasets such as drone imagery, intercepted communications, surveillance data, and individual and group behavior patterns.
Ethical and legal concerns
The use of Lavender and Gospel in Israeli bombing campaigns represents a major advance at the intersection of AI and modern warfare. But while these technologies offer potential benefits in target identification and operational efficiency, their deployment raises serious moral and legal dilemmas.