At the literary festival, people complimented my beaded watermelon earrings. I said in response that I wished I could instead wear a shirt every day that says “WHAT ARE WE FUCKING DOING, SERIOUSLY?” Maybe I should. Most seemed to understand what I meant: the aggression of that hypothetical shirt, my mixed feelings about the fruitlessness of wearing such a small symbol, wearing it to yet another event where the word “Palestine” wouldn’t be uttered from stages occupied by artists.
That was last weekend, and I was still processing the news I’d heard a week before that. A friend texted to ask if I’d heard the Democracy Now! interview about Israel’s “Lavender” and “Where’s Daddy?” AI programs. I hadn’t, but I already had a stressful drive to make, north on unplowed roads, so I queued it up to get all my daily cortisol in one shot.
The way Amy Goodman’s familiar voice curdled around the phrase “Where’s Daddy?” still echoes when I think about this murderous software now. When I heard about it on the radio that day, all I could think was how perverse it seemed to name an assassination program (built to follow alleged Hamas operatives to their homes) “Where’s Daddy.” The collected images flashed: hundreds of men, some perhaps derisively called “daddy” by an occupying force, holding their own dead children, siblings, nieces, grandchildren, wives, and friends.
It’s rare for me to feel shaken by state violence in a way that’s novel. An unfortunate consequence of my long-term interest in studying the worst of organized human behavior, I suppose. But hearing the facts of this new wave of machine-human collaboration in state-sanctioned murder shook me. It belongs on a list alongside the slave ship, the concentration camp, the atomic bomb, Agent Orange, and unmanned drones—the technological innovations that have empowered humans, mad with power, to enact violence at a more-than-human scale.
I highly recommend you read the entire +972 story about Lavender, but I will also summarize the most shocking points here, in case you don’t have 8,000 words in you right now. Writer Yuval Abraham (who also received an absurd level of backlash for his speech while accepting his documentary award at the Berlinale this year) reveals in the article that “…Lavender — which was developed to create human targets in the current war — has marked some 37,000 Palestinians as suspected ‘Hamas militants,’ most of them junior, for assassination.” His sources, Israeli intelligence operatives, go on to reveal that they were given an unprecedented level of approved “collateral damage” to use in conjunction with Lavender’s analysis of targets.
For each alleged operative identified by the AI, operators were allowed between 5 and 100 civilian casualties. They often repeated 20 as an average rate—20 “collateral” deaths per 1 possible “enemy combatant.” At that rate, 20 deaths for each of the 37,000 Hamas “operatives” identified by Lavender means a total of 740,000 “acceptable” casualties. 740,000 Palestinians in Gaza “rubber stamped” for a military-approved, AI-driven death. That’s nearly a third of Gaza’s total population.
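(A quick check of that arithmetic, assuming a prewar Gaza population of roughly 2.3 million — my figure, not the article’s: 37,000 targets × 20 “acceptable” deaths each = 740,000 people, and 740,000 ÷ 2,300,000 ≈ 32 percent of the population.)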
The hypothetical end-scale of this destruction a) underlines beyond a shadow of a doubt the, yes, genocidal intent of the Israeli government, and b) renders absurd their official claim that “…our war is against Hamas, not against the people of Gaza.” Even if you don’t accept that an occupied people have a right to resist through all available means (I happen to think they do), you have to wonder what it means to be “part of the military wing of Hamas” when the 15, or 20, or 100 people who happen to, maybe, probably, statistically, be nearby at a given moment “count” as targets, too. If nothing else, I hope that anyone still reading will go on to interrupt the long-problematic language of the “human shields” narrative, should they hear it in the future.
Among those 37,000 targeted for assassination, “…it was known in advance that 10 percent of the human targets slated for assassination were not members of the Hamas military wing at all.” The Israeli sources interviewed shared how little human oversight the program received, all in the name of “saved time” and “efficiency.” One described how his only role in the process (the rest of which, including surveillance, analysis, target rating, and weapon allotment, was outsourced to automated, AI-driven systems) was to confirm that an identified target was male. Since, by the army’s logic, there are no women fighters in Hamas, the confirmation of sex was enough to press ‘ok,’ dispatching a drone, a bomb, or a tank to the home of the person identified, again, largely by automation alone.
Of course, there are precedents for this kind of broad targeting, particularly, within recent memory, during the sprawling Global War on Terror. It’s not uncommon for all young men in an identified area to be treated as combatants, but the automation of the process, along with the increased surveillance powers available to militaries in this era, further dehumanizes us all. Anyone who’s been part of a targeted social movement, especially in the post-9/11 era (see Stop Cop City for a useful recent example), or a targeted racial demographic (see Stop and Frisk*, for one), knows the risks that emerge when a powerful, militarized force attempts to categorize human beings and predict our behavior. Yet we normalize mass surveillance every day in our passive acceptance of tech oligarchy.
As the friend who texted me to talk about this story put it, “I just feel angry and out of control, implicated and complicit…that’s U.S. technology…They are slapping us in the face with our own vulnerability.”
I should clarify that we don’t know for sure that it is U.S. technology at work in the cloud powering Lavender and Where’s Daddy. We do know that American companies have provided material support to the state of Israel and that, according to Google’s own employees, it’s not possible to know exactly what a client uses computing services for once they’re delivered, particularly not in the case of dedicated and highly secure systems. This week, 28 Google workers were fired after a No Tech for Apartheid day of action, and the first confirmed contract between Google and Israel’s Ministry of Defense was made public. “Don’t be evil,” indeed.
The irony of +972’s publication of the Lavender story is that the iteration of the program the writers revealed is no longer particularly active. The military essentially put itself out of work by how “efficiently” it used the system in the first, and most deadly, weeks of the siege. As the article puts it, “The fact that most homes in the Gaza Strip were already destroyed or damaged, and almost the entire population has been displaced…impaired the army’s ability to rely on intelligence databases and automated house-locating programs.” So many family homes were bombed that there are no physical houses left to bomb. The scale of devastation is nearly impossible to grasp. No wonder everything looks gray, completely otherworldly, in the photos we see now.
Back at the literary festival, one of my favorite writers, Carmen Maria Machado, read from a story about fractured timelines and the people who could jump between them while retaining a coherent, single self. It was weird, and sad, and kind of hot—all the reasons I love this writer. As her character slipped through slightly altered parallel worlds, I found myself laughing a little too loud, before the punch line, or when no one else seemed to be laughing. Maybe it was too real in a moment when I often feel like part of me (the part that pays taxes) is across the ocean destroying entire worlds, another is watching it happen on a tiny screen that watches me back, and another is having a pretty pleasant spring afternoon in the park.
We went to the park before the reading, the park I used to get obliterated at as a teen and play at as a child, and sat in the too-green grass among families and picnic blanket girls and Frisbee Christians. I read a little, but it was a distracting scene. Before long, I heard the vjiiii, vjiiii sound of a drone nearby. Two young men (boys?) were flying it from the nearby tree line. I heard a little girl scream and looked up to see her running away on clumsy, three-year-old legs from where the pilots stood. I hadn’t noticed her family sitting right behind us until she ran back toward the safety of their blanket.
Oddly, my park companion and I had just had a conversation in the car about drones, and how we hated them. Once, we’d been sunbathing on a rock at a peaceful lake when one buzzed from around the corner and hovered. I threw rocks at it until it went away. I heard that Israeli quadcopters have been playing recordings of crying children, among other sounds, into the ruined streets of Gaza to lure people out into the open to be killed. Like how a cougar sounds like a crying baby, if you’re unlucky enough to hear one in the woods. What the fuck are we doing, seriously? my shirt from another timeline asks.
The drone accelerated back up into the air and the little girl laughed and ran towards the tree line again, eager to be scared by what might, in another world, just be a toy.
*For further reading on the surveillance connections between U.S. policing and the Israeli occupation, see the work of Deadly Exchange.