Israel reportedly combines the military AIs 'Lavender' and 'Where's Daddy?' into a 'kill chain'



In October 2023, Hamas, the Palestinian Sunni Islamist organization, launched a large-scale attack on Israel, which in turn triggered an Israeli counteroffensive in the Gaza Strip and a large-scale conflict. The Israel Defense Forces (IDF) have been reported to use an AI called 'Lavender' to select Hamas fighters for assassination, and it has now also been reported that a combination dubbed the 'kill chain,' which pairs Lavender with an AI tracking system, is killing large numbers of Gaza residents, including civilians.

Gaza war: Israel using AI to identify human targets raising fears that innocents are being caught in the net
https://theconversation.com/gaza-war-israel-using-ai-to-identify-human-targets-raising-fears-that-innocents-are-being-caught-in-the-net-227422



'Lavender': The AI machine directing Israel's bombing spree in Gaza | +972 Magazine
https://www.972mag.com/lavender-ai-israeli-army-gaza/

The Israel Defense Forces developed the AI-based system 'Lavender' to mark suspected members of the military wings of Hamas and Palestinian Islamic Jihad (PIJ) for assassination. Lavender analyzes data collected through surveillance of the Gaza Strip's 2.3 million residents and rates each person on a scale of 1 to 100 according to the likelihood that they are a Hamas or PIJ fighter; individuals with high scores are automatically added to the list of assassination targets. Lavender has reportedly listed as many as 37,000 Palestinians as targets.
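As described, this amounts to a score-and-cutoff pipeline: surveillance data produces a per-person rating, and anyone above a threshold is queued as a target. The following minimal sketch reconstructs only that reported logic; every name, the data shape, and the cutoff value of 80 are assumptions for illustration, since none of the system's internals are public.

```python
from dataclasses import dataclass

# Illustrative reconstruction of the scoring-and-cutoff flow described in
# the reporting. Every name, field, and the threshold value are hypothetical.

@dataclass
class Profile:
    person_id: str
    score: int  # the reported 1-100 likelihood rating

def flag_targets(profiles: list[Profile], threshold: int = 80) -> list[Profile]:
    # The reporting says high scorers were added to the target list more or
    # less automatically; the real cutoff is not public, so 80 is a placeholder.
    return [p for p in profiles if p.score >= threshold]

# Example with hypothetical data: two of three profiles clear the placeholder cutoff.
profiles = [Profile("a", 92), Profile("b", 45), Profile("c", 81)]
print([p.person_id for p in flag_targets(profiles)])  # ['a', 'c']
```

Even at the reported identification rate of about 90%, a 37,000-person list generated this way would contain on the order of 3,700 misidentified people, which is why the choice of cutoff matters so much.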

Israel uses AI system 'Lavender' to identify and attack 37,000 targets including Hamas, with a 90% identification rate - GIGAZINE



According to the news site +972 Magazine, the Israel Defense Forces use a tracking system called 'Where's Daddy?' in combination with Lavender. A person Lavender has marked for assassination is registered with Where's Daddy? and placed under tracking. When Where's Daddy? detects that the target has entered their home, it notifies the military personnel in charge, and the home is designated as a bombing target, allowing the IDF to kill the target together with his or her family. This pairing is what is called the 'kill chain.'
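Structurally, what is described is a simple event pipeline: one system emits a target list, another watches for a 'target entered home' event, and that event triggers a notification. A minimal sketch of that reported hand-off, in which every class, field, and event name is invented for illustration:

```python
from dataclasses import dataclass

# Illustrative reconstruction of the reported Lavender / Where's Daddy? hand-off
# as an abstract event pipeline. All class, field, and event names are invented.

@dataclass
class TrackingEvent:
    person_id: str
    kind: str  # e.g. "entered_home"

def process_events(target_ids: set[str], events: list[TrackingEvent]) -> list[str]:
    # Per the reporting: targets from the first system are registered with the
    # tracker, and a detected home entry triggers a notification to personnel.
    return [
        f"notify personnel: {e.person_id} entered home"
        for e in events
        if e.kind == "entered_home" and e.person_id in target_ids
    ]

# Example: only the registered target generates a notification.
targets = {"a", "b"}
events = [TrackingEvent("a", "entered_home"), TrackingEvent("c", "entered_home")]
print(process_events(targets, events))
```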

The bombs used in these strikes are unguided munitions, known as 'dumb bombs' in contrast to precision-guided smart bombs. According to +972 Magazine, the IDF used dumb bombs against low-ranking fighters to keep costs down. The magazine criticized the practice: 'The Israel Defense Forces used dumb bombs with the knowledge that they would harm not only the targeted fighters but also their families and neighboring civilians.'

According to an anonymous source who spoke to +972 Magazine, the IDF permitted up to 15 to 20 civilian deaths for every low-ranking fighter killed, and for high-ranking enemy officials the permissible civilian toll could exceed 100. +972 Magazine strongly condemns this policy as a violation of international law.



According to the source, the IDF used software to automatically calculate the number of civilians in houses marked for bombing. The software estimated each house's occupancy from pre-war data, scaled by the percentage of residents assumed to have evacuated the area. To save time, the IDF did not actually observe the houses to confirm who was inside; it relied on the software's estimate alone.
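The estimate described reduces to a single multiplication: pre-war occupancy times the fraction of residents assumed to remain. A sketch of that reported arithmetic, where the function name and the example figures are assumptions for illustration:

```python
# Illustrative reconstruction of the occupancy estimate the source describes:
# pre-war occupancy scaled by an area-wide evacuation rate. The function name
# and the example figures are hypothetical.

def estimate_occupants(prewar_occupants: int, evacuated_fraction: float) -> float:
    # A single area-level multiplier is exactly the simplification the source
    # criticizes: it cannot see house-level changes such as displaced families
    # moving in after the original residents left.
    return prewar_occupants * (1.0 - evacuated_fraction)

# Example: a house of 10 in an area assumed to be 60% evacuated is scored as
# holding 4 people, regardless of who actually moved in or out.
print(estimate_occupants(10, 0.6))  # 4.0
```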

However, the source said, 'This model was far from reality and did not take into account the large changes in the number of occupants in homes before and after evacuation.'

The source also said there was a time lag between Where's Daddy? detecting the target's return home and the actual bombing of the house. Because of this lag, houses were sometimes bombed after the target had already gone out again, so there were cases in which the target survived and only their family was killed.



Furthermore, in past wars, after assassinating a high-ranking official, military personnel would wiretap phone calls to confirm the number of civilian casualties; in this war, that verification step has been skipped for low-ranking combatants. As a result, the military does not know how many civilians died in those bombings, nor, in the case of low-ranking combatants, whether the targets themselves were even killed.

+972 Magazine criticized the Israel Defense Forces for prioritizing efficiency while using AI to massacre civilians living in the Gaza Strip, and denounced the indiscriminate bombing of fighters together with their families as an act against humanity that flagrantly ignores international law. The academic news outlet The Conversation warned against the unchecked development of military AI: 'Speed and lethality are the watchwords for military technology. But when AI is prioritized, the scope for human agency is marginalized; the logic of the system demands this, because human cognition is comparatively slow. AI also removes the human sense of responsibility for computer-generated outcomes.'

According to The Conversation, the Israel Defense Forces quickly denied using AI systems.

in Software, Video, Posted by log1i_yk