How do automated systems respond to play?
All of our work at UKAI Projects involves other creative people as collaborators, and most of the time the process feels like being ten years old, walking up to someone on the playground, and offering the universal invitation: “wanna play?” I find play so interesting because it is, by definition, an activity that lacks an immediate objective.
We’ve long recognized that many mammals engage in play and research suggests that the activity extends to reptiles and birds. The story of a female Komodo dragon in New York removing a handkerchief from a zookeeper’s pocket creates a wonderful tension between how we imagine the inner life of animals and the behaviors we might observe under the right conditions.
One of the core assumptions of UKAI’s work is that technologies such as artificial intelligence are not inherently dangerous or harmful, but rather become so when their development and distribution are in service to a narrow set of moral positions. Specifically, by prioritizing efficiency and growth as ends unto themselves, we create conditions where we become alienated from our stories, our landscapes, and our own bodies as well as the bodies of those around us.
Play, with its explicit absence of objective, might then become a site of exploration and resistance to the automation of everyday life. Play disorders the assumptions of systems and spaces that relentlessly mold our thoughts and actions toward goals buried deep in socio-technical systems.
Human beings are remarkably adept at reading the ‘rules’ of a particular context. We pay attention to rewards, praise, symbols, and social cues and map out the expectations that a particular space has for us. In some cases, those rules are enforced through threats of institutionalized violence. However, when we have a choice about whether to engage with a space, several categories of response emerge.
Compliance: We adapt to the ‘rules’ of the space. This may be a result of choice (I want what this system offers), habit (I comply without reflection), fear (I want to avoid the consequences of not adapting), or a combination thereof.
Avoidance: We go out of our way to avoid these spaces. A 2018 project I was involved in explored how the mechanics of traditional financial institutions can make Newcomer and racialized communities unwelcome. Formal rules around credit and employment history and informal rules around trust and language ensure that many will avoid banks and seek out solutions to financial challenges in other ways. Non-traditional financial solutions such as money lenders emerge to fill the gap and vulnerable people are willing to pay usurious interest rates to be treated with dignity and care.
Resistance: We go out of our way to upset the operating system of these spaces. I often embarrass my family when responding to contexts that feel intentionally dehumanizing or manipulative, such as telemarketing calls or errors made by a corporation I purchase services from. Resistance is not equally available to all, and it can show up in many ways. Some argue that the Soviet economy was undermined by millions of small acts of resistance.
Play, however, might represent a fourth category of response. Play suggests an indifference to the desires of a system or process and doesn’t posit an alternative (again, by definition). Play offers little in the way of meaningful data for an automated system. Compliance confirms the system’s design. Avoidance exists only as an absence or gap in the data. Resistance becomes something to be absorbed or weeded out. Automated systems, whether technological or bureaucratic, learn little from play, and I am curious whether it may then be useful as a way to undermine the growing hegemony of algorithmic culture.
In 2022, in my role as a program lead for the Goethe-Institut Toronto, we brought together four experts who explore questions around technology and culture. The structure was ostensibly a podcast, and a set of predictable mechanics have come to define these experiences. Experts share their expertise and thoughts about the issues, drawing on an intellectual history appropriate to their discipline.
Rather than falling back on the well-known mechanics of these types of conversations, we opted to try something a bit stranger.
For just over four hours, these experts played a role-playing game modeled on the revised version of Paranoia, the 1984 game set in a world where an AI has taken over and things are both cheerful and terrible. The experience was an unsettling one for both players and organizers. There was plenty of confusion and laughter, and some truly wondrous moments that extended from the process of play rather than from the available knowledge of the assembled experts. The edited recordings will be available in the coming months, and we are excited to get your feedback.
The role-playing game is just one experiment in what we hope will be many as we test out the idea that activities such as play, explicitly lacking an objective, might complicate and interrupt the automation of everyday life. We also think that how an automated system responds to ‘play’ provides insight into how it was built and organized, and therefore how it might be evaluated and critiqued. Can any system that resists or punishes ‘play’ be considered truly human? Do we want to turn over aspects of life to machines if they are unable to make space for joy without an associated goal?