Inside Israel’s Bombing Campaign in Gaza

Since the war began in Gaza, more than six months ago, the Israeli magazine +972 has published some of the most penetrating reporting on the Israel Defense Forces’ conduct. In November, +972, along with the Hebrew publication Local Call, found that the I.D.F. had expanded the number of “legitimate” military targets, leading to a huge increase in civilian casualties. (As of this writing, more than thirty-two thousand Palestinians in Gaza have been killed.) Then, earlier this month, +972 and Local Call released a long feature called “Lavender: The AI Machine Directing Israel’s Bombing Spree in Gaza.” The story revealed how the Israeli military had used the Lavender program to identify suspected militants, which in practice meant that tens of thousands of Palestinians had their homes marked as legitimate targets for bombing, with minimal human oversight. (In response to the “Lavender” article, the I.D.F. said that it “outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.” The I.D.F. also said that, according to its rules, “analysts must conduct independent examinations” to verify the identification of targets. In an earlier statement, from November, the I.D.F. acknowledged “the use of automatic tools to produce targets at a fast pace.”)

The author of both stories was Yuval Abraham, an Israeli journalist and documentary filmmaker. Abraham co-directed the documentary “No Other Land,” about the daily struggles of Palestinians in the West Bank. During his acceptance speech upon winning the award for Best Documentary at the Berlin International Film Festival, in February, Abraham called for a ceasefire in Gaza. In response, he and his family in Israel received death threats. Many people in Germany took issue with the speech, too—including Berlin’s mayor, who implied that it was antisemitic.

I recently spoke by phone with Abraham. During our conversation, which has been edited for length and clarity, we discussed how Israel’s command structure has been making decisions during the war, why military sources have been giving Abraham so much information, and his experience—as a man who had family members die in the Holocaust—of being accused of antisemitism in Berlin.

As a longtime critic of the Israeli military, how do you think that its decision-making about civilian lives has changed since October 7th?

If you examine the military’s conduct in 2021 in Gaza, in the border protests that happened in 2018 and 2019, and in the 2014 war, there was always quite a disregard for Palestinian civilian life and very little accountability for crimes and alleged crimes that were committed by soldiers in Gaza. In 2014, more than five hundred Palestinian children were killed, and the military said that it would investigate what happened. Many cases were opened, implicating soldiers and I.D.F. policies, but in the end just a single case resulted in an indictment, and it concerned three soldiers who committed or assisted in acts of looting. This military is not holding itself accountable, obviously, which is why I think a lot of people are hoping that external courts, such as the I.C.C. and I.C.J., could hold it accountable.

But I would say that the main changes in this operation were the scope and how automated things became, and the automation is very much related to the scope. The military has this term called “human targets.” What it meant in the past was that these were particular individuals, senior-ranking commanders in the Hamas or Islamic Jihad military wings, and that, because of their military importance, the I.D.F.’s international-law departments had decided that civilians could be killed alongside them. So usually what this meant was that they would be bombed inside their houses, killing not only them but often entire families in the process. And this, sources told me, used to be quite a small list, because it’s quite a brutal way to kill somebody. You’re dropping a bomb and destroying the entire house that that person is in.

After October 7th, higher-ups—and we don’t know if they were people in the military or on the political side—made quite an unprecedented decision to mark everybody in the military wings of Hamas and Islamic Jihad as human targets, meaning that, going forward, anybody in those groups, regardless of age, regardless of military importance, could not only be bombed but be bombed with civilians present.

And this posed a technical problem for the military because when you’re working on a small list of these so-called human targets, you need to answer four questions. You need to prove that the individual really belongs to a Hamas or Islamic Jihad military wing. You need to prove where their house is. You need to prove how they are communicating with the world. And then in real time, you have to know when they entered their house so you can bomb it. When the list was small, human beings could do that. When they decided to expand the list to so many people, it became impossible, and that’s why they decided to rely on all of these sophisticated automation and artificial-intelligence tools, and the results of that are horrific.

These machines got it wrong many times, meaning that the push toward this extremely wide scope necessarily meant that civilians were also being erroneously targeted, and, because of the minimal supervision that was in place, they knew that they would not be able to prevent this. And the second thing is that the policy of bombing houses and killing entire families in order to try to kill one senior target was already very controversial and very dubious under international law. But then they applied it in such a broad way as to include alleged low-ranking militants. One source said they called these targets “garbage targets”: we know they are not important from a military point of view, and yet we’re bombing the house. We’re killing a family.

So the decision was made at some higher level to put more people on these lists, but the only way to practically carry it out was the A.I.?

Yeah, I would say that that’s an accurate description. It’s based on machine learning, which is a subset of artificial intelligence; that’s how these machines were trained. They literally relied on it to determine whether human beings get to live or die, to determine whether an individual could be marked for assassination. And yes, if you are going to decide that you want to mark more than thirty thousand people for assassination in their houses, and you want to know in real time when those thirty thousand people enter their houses, there is no way to do it other than using automation and A.I.

One source put it to me like this. He said that the government wanted to be able to tell the Israeli public, “We killed a very high number of Hamas operatives.” That’s the goal. And then, in practice, the way these systems were actually used is that a large number of the people who were being killed were not strictly Hamas operatives. They were either loosely related to Hamas’s military wing or completely unrelated.

There’s been some reporting in Haaretz implying that commanders on the ground have leeway to make targeting decisions. Is that your sense as well?

In our reporting, we say that this machine, Lavender, which marked thirty-seven thousand Palestinians in Gaza as suspected militants, was subject, in certain areas of the I.D.F., to a supervision protocol that consisted only of checking whether the selected target was male or female. Intelligence officers were told that, if the target was female, then the machine had made a mistake for sure, because there are no women in the military wing, but, if the target was male, they were allowed to incriminate the target—meaning to mark that person for assassination—without going in depth and checking why the machine made the decision that it made.

Some sources lower down the chain of command said that they thought the protocol was so outrageous that they went against it. The more low-ranking officers said, “Well, we have to check a little bit more, because we’re killing civilians as targets.” That’s an example of a policy that, to me, seems like a potential war crime and that came from above. Then again, as you said, and as Haaretz has reported, it seems that there are also a lot of policies that come from the bottom, with the higher-ups unaware of what is going on.

Sources that I spoke with said that, during the first weeks of the war, for each of the low-ranking militants they were bombing inside their houses, they were allowed to kill about fifteen civilians. And for the senior commanders in Hamas, the number of civilians was, on several occasions, more than a hundred. In the past, these high numbers would have to be approved by the Army’s international-law department. That’s how it works. But I’ve heard from sources that you would have a particular commander who was very trigger-happy and would authorize high numbers without reporting them to the law department.
