Israeli military uses AI to select targets in Gaza

Israel is using an AI system called Lavender to create a kill list of at least 37,000 people in Gaza allegedly linked to the Hamas terrorist group, according to a new report by Israeli magazine +972, corroborated by The Guardian.

Lavender is the second Israeli artificial intelligence targeting system to come to light, after The Gospel was revealed last year. The difference is that The Gospel targets buildings, while Lavender targets people.

The new report quotes six unnamed Israeli intelligence officers who told +972 that the Israeli military relied almost entirely on Lavender in the first weeks of the war, despite the fact that it was known to misidentify potential targets as terrorists.

According to +972, the military personnel who reviewed the results of the AI’s work spent about 20 seconds on each decision.

The Lavender system reportedly works by analyzing information collected on almost all 2.3 million Palestinians in the Gaza Strip through a mass surveillance system. It assesses the likelihood that a person belongs to Hamas using a non-transparent ranking system.

Each Palestinian is assigned a rating from 1 to 100, which allegedly determines how likely they are to be a member of the terrorist group.

“Lavender learns to identify characteristics of known Hamas and [Palestinian Islamic Jihad] operatives, whose information was fed to the machine as training data, and then to locate these same characteristics — also called ‘features’ — among the general population,” the sources explained, as quoted by Gizmodo from +972 magazine. “An individual found to have several different incriminating features will reach a high rating, and thus automatically becomes a potential target for assassination.”
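The mechanism described above — scoring individuals by counting weighted "incriminating features" and mapping the result to a 1–100 rating — can be illustrated abstractly. The sketch below is purely hypothetical: the feature names, weights, and the logistic mapping are invented for illustration and are not drawn from the report, which does not disclose how Lavender actually computes its ratings.

```python
import math

def rating(features, weights):
    """Hypothetical feature-based scorer: sum the weights of the
    features present, squash with a logistic function, and map the
    result onto a 1-100 rating scale."""
    raw = sum(weights.get(f, 0.0) for f in features)
    p = 1.0 / (1.0 + math.exp(-raw))   # probability-like value in (0, 1)
    return round(1 + 99 * p)           # scaled to the reported 1-100 range

# Invented weights for illustration only.
example_weights = {"feature_a": 2.0, "feature_b": 1.5}
```

The key property this sketch shows is the one the sources describe: each additional matching feature pushes the score monotonically higher, so a person with several features ends up with a high rating regardless of any human review of the underlying evidence.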

The Israeli command gave officers blanket approval to act on Lavender's target lists in Gaza and did not require a thorough review of “why the machine made the choice it did, or to examine the raw intelligence on which it was based.”

The people who reviewed Lavender’s targeting decisions mostly just checked that the target was male, even though internal reviews indicated that at least 10% of the targets had no connection to Hamas. It is unclear how these internal checks were conducted and whether the true percentage was much higher.

According to +972, most strikes on these targets took place at their homes. Another automated system used in conjunction with Lavender, called “Where’s Daddy,” was used to track selected individuals and strike them when they were in their family homes.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” an anonymous Israeli intelligence officer told +972. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

The new report also claims that the targets identified by Lavender were junior militants, and that the Israeli military therefore preferred unguided munitions, to avoid wasting expensive precision bombs on relatively inconsequential targets.