8200
Goldsmiths College, University of London, 2024
Title: 8200
Duration: indefinite
Credits:
Lawrence Livermore National Laboratory Archives, used under CC BY-NC-SA 4.0
Stills from Blame! (2017), Hiroyuki Seshita (dir.), Polygon Pictures/Netflix
Kanji training dataset
Franz Schubert String Quintet D. 956 (Casals, Stern, Tortelier, Katims, Schneider), public domain
+972 Magazine, ‘Lavender: The AI machine directing Israel’s bombing spree in Gaza’
This software work's proposition is that the histories of nuclear weapons and AI are deeply entangled. Both are materially grounded in high-performance computation (centralised supercomputers such as the Cray machines in the former case, GPU-based hyperscale clusters in the latter); both are institutionally bound to the defence-academic-industrial complex; and, while born of different historical conditions (the ruins of postwar Europe and the US, versus the post-2008 incarnation of capitalism's ever-imminent crisis), both express entrenched and seemingly intractable geopolitical power hierarchies.
The most salient entry point for this work is an article in +972, an Israeli magazine, based on the testimony of Israel Defense Forces personnel, on the IDF's use of AI systems called 'Lavender', 'Where's Daddy?', and 'The Gospel' for the operational and bureaucratic management of the latest iteration of the genocide in Gaza. The article gives detailed information on the workings and interactions of these systems and on the IDF's command-and-control procedures. It describes the IDF's increased willingness, compared to previous conflicts, to tolerate non-combatant deaths ('collateral damage') as it targeted alleged Hamas operatives sleeping in their homes. It also details the inaccuracy and high error rate of the algorithms, while acknowledging the difficulty of prosecuting a conflict in which the line between combatant and non-combatant is blurred. This blurring happens adversarially (Hamas operatives are said to hide themselves and their assets within the civilian environment), but also at the level of semantics and definitions (children today become combatants tomorrow, whether in the Occupied Territories or in Israel).
This contemporary example of militarised technology is placed alongside, and against, archival footage of American atomic tests, which were typically conducted on Pacific islands or Native American ancestral land, and which impacted, and continue to impact, those communities.
There is also a further tension here, at the level of software and production: the relevant archives are available under a Creative Commons licence, much as current AI research relies on open-source codebases and a vast trove of public and semi-public data. A mixed credo of openness and secrecy was thus foundational (as much as private capital or the market) to the development of technology in the US, seen most saliently in the Internet (née ARPANET).
The IDF's use of automated targeting systems is both highly visible, thanks to +972's reporting, and comfortably distant for many in the US and Western Europe. There is, however, no in-principle reason these systems cannot be deployed closer to home: both for targeting and, as robotics improve, for automated interdiction.