Digital Desk: In the midst of conflict, the use of AI technology like the Lavender system underscores the ethical dilemmas and complexities of modern warfare. The system's alleged role in identifying targets, including low-level operatives, raises concerns about civilian casualties and adherence to international law.
In the weeks following Hamas' surprise attack on Israel on October 7, the Israel Defence Forces allegedly targeted civilian homes deliberately and used an AI-based programme called Lavender to generate targets for assassination, an investigative report claims, resulting in scores of bombings carried out with little human oversight.
According to the report, Israeli intelligence officers describe a reliance on Lavender's recommendations, sometimes with minimal human review, leading to airstrikes on civilian homes where Hamas targets were believed to be present. This reported prioritisation of target generation over the protection of civilian lives is deeply troubling.
Critics have condemned these tactics as inhumane, highlighting the devastating impact on innocent civilians. The reported use of AI to streamline targeting processes raises broader questions about the ethical implications of incorporating such technology into military operations.
The IDF has denied using AI to directly identify targets but acknowledges Lavender's role in cross-referencing intelligence sources. There remains, however, a clear gap between these official statements and the accounts provided by intelligence officers.
Israel has faced ongoing criticism for the high civilian death toll in its operations, which have struck residential areas, hospitals, and refugee camps. According to the IDF, Hamas routinely conducts military activities in residential neighbourhoods, using civilians as human shields.
The IDF's targeting practices have sparked a fresh round of international condemnation after Israel killed seven World Central Kitchen relief workers in an airstrike in Gaza on Monday.
What is ‘Lavender’ AI?
In 2021, a book titled "The Human-Machine Team: How to Foster Collaboration Between Human and Artificial Intelligence to Transform Our World" was published in English under the pseudonym "Brigadier General Y.S.". In it, the author, whom the report identifies as the current commander of Israel's elite intelligence unit 8200, argues for the development of a specialised machine capable of rapidly processing vast amounts of data to generate numerous potential targets for military strikes during wartime.
He contends that such technology would alleviate what he calls a "human bottleneck" in both identifying targets and making the decisions needed to approve them.