Military Tech

Published: 2023-11-27

Israel's Use of AI in Hamas War Can Help Limit Collateral Damage 'If Executed Properly,' Expert Says

IDF Faces Criticism for Excessive Deaths and Collateral Damage in Gaza

An IDF soldier operating an AI-powered drone during a mission in Gaza.

The Israel Defense Forces (IDF) are using artificial intelligence (AI) in the ongoing conflict with Hamas in an effort to improve the precision of their targeting. As the IDF faces accusations of causing excessive deaths and collateral damage in Gaza, experts believe that AI, if executed properly, could help limit these unintended consequences.

Mark Montgomery, a senior fellow at the Foundation for Defense of Democracies' Center on Cyber and Technology Innovation, expressed optimism about the potential of AI to streamline the target identification, evaluation, and assessment process. He noted that, like U.S. forces, the IDF is committed to reducing collateral damage and civilian casualties. By incorporating AI and machine-learning tools, the targeting process becomes more agile, expediting target identification, review, and approval.

Montgomery emphasized that the inclusion of AI does not eliminate human involvement in the targeting process. Instead, it accelerates the timeline and complements the decision-making of human operators. While AI cannot eliminate casualties entirely, properly executed it has the potential to reduce harm and enhance the effectiveness of military campaigns.

Mark Montgomery, an expert in AI and military operations, discussing the use of AI in the IDF at a conference.

The IDF has been at the forefront of integrating AI into its military operations, employing it primarily for real-time visual targeting from tanks and drones, as well as for target selection based on environmental data. The IDF emphasizes that human review is an essential step in the process, ensuring that no fully automated decisions are made.

Critics have raised concerns about the use of AI in warfare. However, Montgomery argues that the IDF faces a challenging situation, with opposing pressures that would subject it to criticism regardless of its choices. He points out that Israel has made efforts to provide evidence for its actions, such as revealing the tunnels under the Al-Shifa Hospital, but that the use of AI to examine sonar imaging could be misconstrued as manipulation. Nevertheless, based on available information, the IDF's use of AI is focused solely on improving targeting and identification capabilities.

Montgomery addresses the issue of collateral damage and civilian casualties by highlighting the complex nature of the environment in which the IDF operates. He explains that the tactics employed by Hamas, such as using civilians as shields and concealing war-fighting capabilities near hospitals, significantly contribute to civilian casualties. Thus, while AI may be a tool in the targeting process, it is not the driving force behind excessive collateral damage.

A close-up shot of AI algorithms running on a computer screen, illustrating the technology behind the IDF's targeting process.

The ultimate goal of incorporating AI in the IDF's operations is to reduce harm and enhance the efficiency of military campaigns. By leveraging AI's capabilities, the IDF aims to decrease civilian casualties, expedite campaign execution, and effectively counter Hamas as a terrorist organization.

As the conflict continues, the potential impact of AI in warfare remains a subject of debate. However, experts assert that with proper execution and careful consideration of human oversight, AI has the potential to revolutionize military operations and minimize unintended consequences.