
Investigation Reveals Israeli Military Reliance on AI for Gaza Airstrikes

AFP | Destruction in Rafah due to Israeli airstrikes

NOS News • today, 06:43 • Updated today, 08:38

The Israeli military used artificial intelligence (AI) to select which individuals were targeted in bombings in Gaza, without careful human oversight, reports news site +972. According to the Israeli journalist who wrote the article, there was hardly any human supervision of the system, called Lavender, and airstrikes were deliberately delayed until a terror suspect was at home.

The Israeli military dismisses the article as a series of false accusations. In a statement published by, among others, The Guardian, it acknowledges the use of AI, but only in a supporting role. According to the army, the system referred to is “simply a database” of available information on armed militants.

‘Decision must be made extremely carefully’

The progressive news site +972, a collaboration between Israeli and Palestinian journalists, bases its reporting on conversations with six sources within the Israeli army. “If the information is correct, it is shocking,” says former Dutch army commander Mart de Kruif. “Under the laws of war, the decision to carry out an airstrike must be made extremely carefully.” Something like this cannot be left largely to AI, De Kruif warns.

He adds that the Israeli military is known to “rely a lot” on artificial intelligence. According to him, the overall picture painted by the article is also consistent with available information about the heavy bombing of Gaza in the first weeks of the war.

At some point we trusted the system and only checked if the target was male.

An anonymous officer quoted by +972

Defense specialist Peter Wijninga of The Hague Centre for Strategic Studies wonders “whether this is indeed a bad situation”. “That depends on what the system is fed with,” he says. “What criteria are used to decide whether or not people end up on the target list?” To assess this properly, he says, it is important that clarity about this is provided afterwards.

According to the anonymous officials, unguided bombs were used systematically in the early days of the war, resulting in many civilian casualties. Wijninga calls it “worrying” that mainly unguided bombs were used during that period. “Such a bomb can also land next to a house and unintentionally damage neighboring houses where there may be no Hamas fighters at all.”

The article does not say to what extent the AI system is still in use, or how airstrikes are currently carried out.

A total of 37,000 Palestinians are said to have been identified by Lavender as terror suspects and potential targets. According to the anonymous sources, they were selected on the basis of a wide range of information, from social media posts and camera images to intelligence reports and telephone data. “It was known that 10 percent of the human targets were not members of Hamas,” the article’s author wrote.

‘Permission for civilian casualties’

The officers who had to approve strikes, +972 reports, were given permission by senior officers to kill civilians if Hamas fighters were also eliminated; the rank of the Hamas target reportedly made little difference to whether that permission was granted. According to the anonymous sources, fifteen to twenty civilian casualties were sometimes accepted as ‘collateral damage’ for the killing of a single foot soldier. For senior Hamas commanders, that number reportedly ran to more than a hundred. Hard evidence for these claims is lacking.

The intelligence officers quoted as having used Lavender suggested that warfare was effectively being handed over to AI. “At one point we relied on the automated system and only checked whether the target was male,” said an anonymous source. According to him, about twenty seconds were allotted for that final check.

Military: This is not a list of targets

Israel strongly denies that it operates this way. The military says Lavender is merely a supporting tool; the information in the system “is not a list of confirmed militants who may be targeted.” According to the army leadership, a thorough human assessment always precedes any decision whether or not to bomb. It also emphasizes that Hamas operates from residential areas “and systematically uses the population as a human shield.”

Correspondent Nasrah Habiballah:

“The publication is not making many waves in Israel. The Israeli army denies the allegations, and the public’s confidence in the army is high. Almost everyone here has served in the army themselves, because there is compulsory military service. Many Israelis are convinced that they have the most moral army in the world, and so they tend to believe the army when it comes to these matters.”

More than 33,000 Palestinians have been killed in the war, according to the Hamas-run Health Ministry in Gaza. Nearly half of them, about 15,000 people, died in the first two months of the war. According to counts by Gazan authorities, which are considered reliable by UN agencies, more than half of the dead are women or children.

Civilians have also been killed by failed Hamas rocket launches, but those numbers are said to pale in comparison with the toll from Israeli airstrikes. According to an Israeli study that the newspaper Haaretz reported on in December, the bombings in October alone caused more than 6,747 deaths, 61 percent of whom were civilians. Israeli officials previously told The Times of Israel that for every Hamas fighter killed, about two civilians died in Gaza.

‘Investigation needed’

Peace organization PAX Netherlands is calling for an international investigation into the methods attributed to the Israeli army. “What kind of system is Lavender, what role did humans play, and were war crimes committed?” says Daan Kayser, who specializes in autonomous weapon systems, summing up his questions.

The question is whether Israel will release more information. Israel has repeatedly been accused of war crimes in Gaza over the past year, but has not allowed independent investigators to determine what exactly happened in those incidents.

