According to a report, Israel is employing artificial intelligence to aid in selecting bombing targets in Gaza.

Hamas is designated a terrorist organization by several countries and international bodies, including the United States, Israel, and the European Union.

According to an investigation by +972 Magazine and Local Call, the Israeli military has been employing artificial intelligence to aid in identifying bombing targets in Gaza. Citing six Israeli intelligence officials allegedly involved in the program, the report suggests that human scrutiny of the suggested targets was minimal. The AI tool, known as “Lavender,” reportedly has a 10% error rate.

While the Israel Defence Forces (IDF) did not dispute the existence of the tool, they denied using AI to identify suspected terrorists. Instead, they emphasized that information systems serve as tools for analysts and that efforts are made to minimize harm to civilians during strikes.

However, according to one official quoted in the investigation, human personnel often merely acted as a “rubber stamp” for the machine’s decisions, spending minimal time verifying targets before authorizing bombings. This revelation comes amidst increased international scrutiny of Israel’s military actions, particularly after air strikes killed foreign aid workers in Gaza.

The investigation highlights the humanitarian crisis in Gaza, with tens of thousands of casualties and a significant portion of the population suffering from severe hunger, according to United Nations-backed reports.

The author of the investigation, Yuval Abraham, previously discussed the Israeli military’s heavy reliance on artificial intelligence for target generation with limited human oversight.

The IDF clarified that it does not use AI to identify terrorist operatives but rather relies on databases to gather intelligence, with human officers responsible for ensuring targets comply with international law and IDF directives. This account contrasts with the process described in the investigation.

Attacks carried out at night

The magazine also alleged that the Israeli army consistently targeted individuals in their homes, often conducting these attacks at night when entire families were present.

According to the report, thousands of Palestinians, including many women, children, and individuals uninvolved in fighting, were killed by Israeli airstrikes, particularly during the initial weeks of the conflict, due to decisions made by the AI program.

Sources cited in the report indicated that when targeting suspected junior militants, the army opted to use unguided munitions known as “dumb bombs,” which can cause widespread damage.

Photo: Residents of the Maghazi refugee camp in the central Gaza Strip assess the destruction caused by an Israeli airstrike on a residential building on Friday, March 29, 2024.

CNN reported in December that almost half of the 29,000 air-to-surface munitions dropped on Gaza last fall were dumb bombs. These types of bombs can pose a greater threat to civilians, particularly in densely populated areas like Gaza.

The IDF says it does not carry out strikes expected to cause excessive harm to civilians and seeks to minimize civilian casualties during its operations.

The Israeli military says it reviews targets before attacking and selects appropriate munitions based on operational and humanitarian considerations, including the structure and location of the target, the surrounding area, the possible impact on nearby civilians, and critical infrastructure.

Israeli leaders maintain that powerful weapons are needed to defeat Hamas, the group that killed more than 1,200 people in Israel and took many hostages in the attack that triggered the current war.
