Article | Open Access
Training for the Algorithmic Machine
Abstract: In thinking about the ubiquity of algorithmic surveillance and the ways our presence in front of a camera has become entangled with the algorithmic logics of testing and replication, this project summons Walter Benjamin’s seminal essay The Work of Art in the Age of Its Technological Reproducibility, which exists in three versions and was published in the United States under the editorial direction of Theodor Adorno. More specifically, it highlights two of the many ways in which the first and second versions of Benjamin’s influential essay on technology and culture resonate with questions of photography and art in the context of facial recognition technologies and algorithmic culture more broadly. First, Benjamin provides a critical lens for understanding the role of uniqueness and replication in a technocratic system. Second, he proposes an analytical framework for thinking about our response to visual surveillance through notions of training and performing a constructed identity—hence, being intentional about the ways we visually present ourselves. These two conceptual frameworks help to articulate our unease with a technology that trains itself on our everyday digital images to create unique identities that further aggregate into elaborate typologies, and to think through a number of artistic responses that have challenged the ubiquity of algorithmic surveillance. Taking up Benjamin’s conceptual apparatus and his call for understanding the politics of art, I focus on two projects that powerfully critique algorithmic surveillance. Leo Selvaggio’s URME (you are me) Personal Surveillance Identity Prosthetic offers a critical lens through the adoption of algorithmically defined three-dimensional printed faces as performative prosthetics designed to be read and assessed by an algorithm.
Kate Crawford and Trevor Paglen’s project Training Humans is the first major exhibition to display a collection of photographs used to train an algorithm as well as the classificatory labels applied to them both by artificial intelligence and by the freelance employees hired to sort through these images.
Keywords: algorithmic culture; artificial intelligence; critical theory; facial recognition; selfies; surveillance; technological reproducibility
© Stefka Hristova. This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 license (http://creativecommons.org/licenses/by/4.0), which permits any use, distribution, and reproduction of the work without further permission provided the original author(s) and source are credited.