
Israeli Military’s AI-Powered Database Identified 37,000 Hamas Targets in Gaza, Intelligence Sources Reveal





The Israeli Military’s Use of AI Database in Gaza Raises Legal and Moral Questions

The Israeli military’s bombing campaign in Gaza has come under scrutiny following the revelation of its previously undisclosed use of an AI-powered database known as Lavender. According to intelligence sources, Lavender identified 37,000 potential targets linked to Hamas during the six-month war. The use of AI systems in warfare has prompted concerns about how military operations are being transformed and about the shifting relationship between humans and machines in lethal decision-making.

Controversial Roles and Cold Calculations

Testimonies from six intelligence officers offer candid accounts of how Lavender was used to identify Hamas and Palestinian Islamic Jihad (PIJ) targets. The officers describe a heavy reliance on the AI system, saying it provided statistical accuracy and made target identification easier. Lavender’s automated database of potential targets raised questions about how much value human analysis still added to the process.

Implications for Palestinian Civilians

The officers’ accounts highlight concerns about the large number of Palestinian civilian casualties, particularly in the early stages of the conflict. The Israeli military’s use of unguided munitions, known as “dumb bombs,” destroyed numerous homes and cost civilian lives. The high death toll and its impact on Palestinian families reflect the alarming consequences of the military’s AI-enhanced operations.

The Development and Utilization of Lavender

Developed by the Israel Defense Forces’ elite intelligence division, Unit 8200, Lavender served as a target identification tool by categorizing individuals based on their alleged links to Hamas or PIJ. The system generated a database of tens of thousands of individuals associated with Hamas’s military wing. Another AI system, called “the Gospel,” focused on recommending structures as targets rather than individuals.

Contemplating the Cost and Collateral Damage

According to the intelligence officers, the military’s targeting approach allowed for significant collateral damage and civilian casualties. Pre-authorized allowances for the estimated number of civilians that could be killed per strike were reportedly in place, varying based on target seniority. The fluctuating limit on collateral damage further raises legal concerns and questions about the IDF’s proportionality assessment.

A Shift in Targeting Tactics and Consequences

The intensive bombardments during the conflict demonstrate a shift in the IDF’s strategy toward targeting low-ranking militants, often while they were believed to be in their homes. This approach posed significant risks to civilian populations, and the IDF apparently authorized the killing of civilians during strikes on low-ranking militants. The casualty threshold fluctuated throughout the war; it was, one intelligence officer said, “a policy so permissive that in my opinion it had an element of revenge.”

The use of AI systems and the resulting high death toll raise legal and moral questions about proportionality and the military’s duty of care towards civilian populations. The IDF, while maintaining that its operations adhere to the rules of proportionality, faces mounting concern over the consequences of its AI-enhanced warfare tactics.

