Nov 06, 2024
Intelligent Genocide: The War of the "Advanced" World Against the Palestinians and the Lebanese - Elham Barjas
Elham Barjas
Researcher

A year and three weeks have passed since the start of the genocide against the Palestinian people. Over time, both the scene itself and the reactions to it have become automatic: positions are stated reflexively and with clarity. Those who oppose it cling to their position and repeat it; those who support it do the same, by way of justification. All the while, the Israelis keep killing, and Palestinian casualties keep mounting. This mechanism is a logical extension of the fact that the battle itself is being waged by machines, outstripping human knowledge and the human capacity to grasp the scale of the crime, to comprehend it, and to punish it.



Meanwhile, the ethics of AI and the international rules governing its use and its industry, in both the public and private sectors, are still being developed. Israel takes part in this ethical and legal workshop even as it wages its genocidal war in Palestine. As a mark of its "credibility," Israel has taught the AI systems that manage the genocide that women are not military targets, leaving it to human operators to kill them and their children as collateral damage.



On September 5, 2024, Israel joined the EU, the US, the UK, and others in signing the first international treaty on artificial intelligence (AI) and human rights.[1] The treaty was developed and drafted within the framework of the Council of Europe, together with other states including "Israel," to set rules and obligations for the "responsible" use of AI, with a focus on protecting human rights, democracy, and the rule of law.



The launch of this agreement coincides with Israel's continued genocide against the Palestinians in the Gaza Strip and its expansion into Lebanese territory. The ethnic cleansing has extended to the residents of majority-Shiite areas of southern Lebanon, parts of Beirut, and the Bekaa region. Israel commits its crimes against humanity with the support of the major EU states and the US, which supply it with money, weapons, ammunition, and ample international political cover. Israel has also signed a nascent agreement with the UK to develop AI technologies, at a cost to the latter of 1.7 million pounds.[2]


The Framework Convention defines an artificial intelligence system as a "machine-based system that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations or decisions that may influence physical or virtual environments. Different artificial intelligence systems vary in their levels of autonomy and adaptiveness after deployment."



The countries that drafted and signed the agreement do not hide their concern "that certain activities within the lifecycle of artificial intelligence systems may undermine human dignity and individual autonomy, human rights, democracy and the rule of law," and in particular these technologies' "potential effect of creating or aggravating inequalities, including those experienced by women and individuals in vulnerable situations, regarding the enjoyment of their human rights and their full, equal and effective participation in economic, social, cultural and political affairs."



Hence, the agreement was developed to prevent "the misuse of artificial intelligence systems" and to oppose "the use of such systems for repressive purposes in violation of international human rights law, including through arbitrary or unlawful surveillance and censorship practices that erode privacy and individual autonomy."



Of course, every rule has its exceptions. The reservation that "Israel" appended to its signature carves out "matters relating to national defense" as an unconditional exception, on the grounds that such matters "do not fall within the scope of this Convention."



This first-of-its-kind agreement was developed and opened for signature in the context of a sustained Israeli effort to build AI systems dedicated to multiplying military targets into the thousands per day. That effort rests on mass data collection on Gaza's 2.3 million residents, the wholesale violation of their privacy, and years of unlawful use of their data.


While work on drafting the agreement began in 2019, a book entitled The Human-Machine Team: How to Create a Synergy between Humans and Artificial Intelligence That Will Change Our World was published in 2021, written under a pseudonym by the current commander of the elite Israeli intelligence Unit 8200. In it, he argues for designing "a special machine that could rapidly process massive amounts of data to generate thousands of potential 'targets' for military strikes in the heat of a war."



According to the book's author, the technology would "resolve what he described as a 'human bottleneck for both locating the new targets and decision-making to approve the targets.'"[3] The book was published a year before the Council of Europe formed the committee that drafted the Framework Convention on Artificial Intelligence and Human Rights. In October of that same year, 2022, the UN Human Rights Council passed a resolution emphasizing the centrality of human decision-making in the use of force and warning against reliance on unrepresentative data sets, algorithms, and machine-learning processes.


Since October 7, 2023, Israel has been using the Lavender system, which, according to the +972 Magazine investigation, is precisely the machine The Human-Machine Team described. The other systems Israel uses are called The Gospel and Where's Daddy?, the latter a crude expression of the aim: targeting and killing men in their homes, with their families.[4]



Does Lavender Protect Women?



Lavender, the most famous of these systems, analyzes information collected over years on the 2.3 million residents of the Gaza Strip through a mass surveillance apparatus. Each person is given a rating from 1 to 100 reflecting the likelihood of being a member of Hamas or the Palestinian Islamic Jihad movement. The Gospel system identifies military targets, including underground targets such as tunnels, the homes of the families of people Israel has designated as targets, and "vital targets," that is, targets whose destruction would "shock" civilians into pressuring Hamas. Based on Lavender's classifications and the locations determined by The Gospel, the "target" is finally spotted at home. The house is then bombed with everyone inside, usually at night to ensure elimination, and with dumb bombs so as not to spend vast sums of money on "unimportant" targets.[5]


The "targets" are usually anyone affiliated with Hamas or the Palestinian Islamic Jihad. The AI identifies them from the information it is fed: the data Israel has unlawfully harvested in Gaza and Israeli assumptions about how Hamas fighters operate. Accordingly, anyone who has worn a uniform belonging to one of these organizations in the past two years is now classified by the machine as a military target.



In another example, Lavender tracks patterns of work and communication. Since the Israelis assume that specific devices are used for communications, anyone who uses the same method of communication is deemed a military target by Lavender, including police and civil defense personnel. According to the +972 Magazine investigation, the Israeli military treats Lavender's output as though it were an order issued by a human, executing it without verifying that it matches the actual situation.


Lavender is also shaped by Israel's broad definitions of terrorism and terrorist groups.[6] Israel relies on general provisions of its military law in the occupied Palestinian territories to ban "hostile organizations," including Palestinian human rights organizations, designating them as "terrorist groups" in the West Bank. Palestinians are detained simply for membership in or association with these organizations or entities close to them. In other words, members of these associations can be identified by Lavender as military targets, which means accepting that they will be bombed and killed inside their homes along with their families.


The algorithms used by Israel's AI systems in this war defy international humanitarian law (IHL) and disregard its fundamental principles, using gender as a targeting criterion while ignoring collateral damage. The Israelis have taught their AI systems everything needed to recognize Palestinians as military targets, and they follow the machine's orders to bolster their claims of accuracy and of respect for IHL standards, standards they have not, incidentally, taught their machines. The only thing the machines have been taught is "gender sensitivity."


A witness told the investigation that when a target appears to be a woman, the Israelis assume the machine has made a mistake, based on the further assumption that Hamas and Islamic Jihad do not recruit women. In any case, the investigation continues, the Israelis spend no more than 20 seconds verifying whether a target is a woman or a man. Either way, they ultimately strike the person at home, among family and children. And since gender is the only criterion by which a military target is excluded, every Palestinian man in Gaza can be classified by the machine as a military target to be eliminated, along with his entire family.[7]


Thanks to the "precise" targeting mechanism followed by the Israeli occupation army, the number of women and children killed in Gaza is many times higher than the highest toll of women killed in any armed conflict since 2004.


According to a report published by Oxfam at the end of September 2024, the highest number of women killed in a single year of direct conflict between 2004 and 2021 was 2,600, in Iraq in 2016.[8] On April 16, 2024, UN Women reported that among the women killed in Gaza were six thousand mothers, whose deaths left 19,000 children orphaned.[9] According to the latest figures published by the Palestinian Central Bureau of Statistics,[10] the victims in Gaza include 11,585 women and 17,029 children, or 67% of all those killed. Adding the elderly, medical personnel, and journalists, the "collateral damage" of eliminating the precise targets identified by AI exceeds 75% of the war's victims.


The Israeli extermination machine kills women and children in cold blood, holding their fathers, husbands, and brothers responsible for its far-from-random classifications. It chooses to fight from a distance with dumb bombs, stretching the standard of proportionality to its limit while hiding behind AI.


In parallel with its genocide, and together with the US, the EU, and the UK, Israel is making a mockery of the human rights system, gender issues, equality, and justice. It exploits gender sensitivity to polish the image of its machines and AI systems while leaving its soldiers free to kill Palestinian women en masse by conscious human decision. The machines are used explicitly to multiply targets by the thousands each day, under the claim that they are precise and strike only the designated targets.


While Israel, along with the EU, the US, and the UK, continues to mock the human rights system, we, on the other hand, remain committed to this system. We shall defend it and try to protect what remains.






    1. The Framework Convention on Artificial Intelligence and Human Rights.

    2. "Landmark agreement with Israel takes UK global science mission to new heights."

    3. Abraham, Y. (2024, April 3). "'Lavender': The AI machine directing Israel's bombing spree in Gaza." +972 Magazine.

    4. Human Rights Watch. (2024, September 10). "Gaza: Israeli Military's Digital Tools Risk Civilian Harm."

    5. Abraham, Y. (2024).

    6. Human Rights Watch. (2024).

    7. Abraham, Y. (2024).

    8. Oxfam International. (2024, September 30). "More women and children killed in Gaza by Israeli military than any other recent conflict in a single year – Oxfam."

    9. UN Women. (2024, April 16). "The War on Gaza Is a War Against Women" [in Arabic].

    10. Palestinian Central Bureau of Statistics dashboard, based on figures published at the time of writing, 24/10/2024.


