Disrupt by the Rules. Interview with Florian Cramer

Human prejudices reemerge in algorithmic cultures allegedly devised to be blind to them. Algorithmic identity politics reinstate old forms of social segregation: in a digital world, identity politics is pattern discrimination. It is by recognizing patterns in input data that artificial intelligence algorithms create bias and practice racial exclusion, thereby inscribing power relations into media.

These are just some of the themes explored by Clemens Apprich, Wendy Hui Kyong Chun, Hito Steyerl and Florian Cramer in Pattern Discrimination, a new book published by University of Minnesota Press in November 2018. After reading it, I wanted to learn more and got in contact with Florian Cramer, applied research professor at Rotterdam University of Applied Sciences and author of the essay "What Is Post-Digital?".


From the very first line, you make clear that the discussion about big data and problematic algorithms concerns a problem as old as Western civilization: hermeneutics, and how we interpret and translate content from one language to another in order to make sense out of it. Do you think the whole question is tied to the very nature of the alphabet? I am referring to the representation of phonemes which, before any further acts of translation, is supposed to translate specific sounds into letters. I wonder if the same questions concerning discriminating algorithms would have arisen if computers had been developed by a culture that wasn't based on the Greek alphabet.

The alphabet is just one example of a way of turning information into data. Numbers are another, Western classical musical score notation yet another. Historically, it took centuries for alphabets to become sufficiently standardized and formalized to be transcodable back and forth into numbers. For computability, however, the type of encoding doesn't matter: whether it's the Greek or the Latin alphabet, the Arabic abjad or Japanese hiragana. It doesn't even matter for computability whether a computer internally computes with binary, ternary, decimal or hexadecimal numbers. All that is needed for any type of computational data mining is a way of encoding information (no matter whether words, images, sounds, temperatures, moisture…) into numbers so that calculations can be performed on them. Typically (even in A.I. Deep Learning), these are statistical calculations.
Such language computation existed before Greek antiquity, in Middle Eastern gematric mysticism and magic. In gematria, numerical values – or, in today's computer engineering terminology, "checksums" – were calculated for the names of gods and emperors to show their equivalence to other words and names with the same numerical values. The German linguist Franz Dornseiff covers this history in his 1922 book "Das Alphabet in Mystik und Magie" ["The Alphabet in Mysticism and Magic"], which is freely available on archive.org. In later centuries, gematria became a standard technique in the Jewish and later the Christian Kabbalah.
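The gematria-as-checksum analogy can be made concrete with a short sketch. This is a simplified illustration only: it assumes a plain A=1 … Z=26 letter-value scheme over the Latin alphabet, not any historical gematria table.

```python
# Gematria-style "checksum": each letter gets a numeric value and a
# word's value is the sum of its letters. The A=1..Z=26 mapping below
# is an illustrative simplification, not a historical gematria scheme.

def word_value(word: str) -> int:
    """Sum the positional values of a word's Latin letters (A=1 .. Z=26)."""
    return sum(ord(c) - ord('a') + 1 for c in word.lower() if c.isalpha())

def equivalent(a: str, b: str) -> bool:
    """Two words are 'gematrically' equivalent if their checksums match."""
    return word_value(a) == word_value(b)
```

As in historical gematria, two different words with the same sum count as equivalent; for instance, any two anagrams trivially share a value, since the sum ignores letter order.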

You make a very interesting connection between Fluxus performances, more specifically Counting Songs (1962) by Emmett Williams, and the roles played by developers and users in today's digital programs. The audience and the users share the same belief that they are engaging in a participatory event while, in fact, the artists-programmers hold all the strings in their hands and are able to pull and modulate them whenever they like. These performances and digital programs are designed to make sense out of chaos, to build a frame wherein what happens is governed by rules. This reminded me of the definition of 'play' by Johan Huizinga in his book "Homo Ludens" (1938): "it proceeds within its own proper boundaries of time and space according to fixed rules and in an orderly manner. It promotes the formation of social groupings which tend to surround themselves with secrecy and to stress their difference from the common world by disguise or other means". This means players have to agree together to play by shared rules. Is it even possible to choose not to take part in big data games?

Huizinga’s theory is a highly relevant addition to my contribution to the book, if not an omission on my part; if I were still in the writing phase, I would incorporate it. Yes, indeed, one might look at contemporary social media (which, next to the usual suspects of YouTube, Twitter, Facebook and company, also include Massively Multiplayer Online Games such as World of Warcraft) as games that aim to remove their “proper boundaries of time and space”, for the simple reason of growing market share and company value.
However, one may argue that such boundary-defying games preexisted computing, the Internet and big data. Examples include monetary systems, stock markets, insurance systems and telephone networks. Beyond a certain size, they turn from “opt-in” into “opt-out” games. Often, opting out means breaking with society or living at its fringes. This can manifest itself as a romanticism of non-participation that can be traced back to Henry David Thoreau, Diogenes and Chuang Tzu, and whose contemporary practitioners in the arts included Pete Horobin (a.k.a. Marshall Anderson a.k.a. Peter Haining a.k.a. Ae Phor), Heath Bunting and Goodiepal, at least at certain points in their work and life. It also includes the Unabomber, preppers and right-wing militias.
Since the question of participation is ultimately social and political, the question concerning these games is – in my opinion – not so much whether they should or better shouldn’t exist, but who designs and controls them. From that perspective, the contemporary big data games simply boil down to a massive privatization of the public sphere. Yet one should be careful not to simply blame this development on the technology, since it is structurally, politically and economically no different from the privatization of other public spheres such as public transport systems, energy, water and social housing services, and the conversion of public ground into shopping malls and gated communities. 
When the Fluxus artists performed their Counting Songs, they of course had no idea that they acted as an avant-garde of the “soft”, “nudging” and “gamified” people control techniques in 21st century privatized public spheres. They shared this fate with their contemporaries from the Situationist International – whose concept of a ludic urbanism had been directly inspired by Huizinga and conversely inspired contemporary shopping mall design.[*]

The Fluxus performance ceases to be whenever a participant doesn’t follow the rules, just as a football match is interrupted when a player takes the ball in his hands. How can we step out of the game when the referee (the programmer, the artist) keeps changing the rules in order to extend the playing field to more and more areas of our life?

Football is a good example, because that game, too, is subject to modifications of its rules: the Video Assistant Referee introduced at the last World Cup, the Three Points for a Win rule introduced in 1995, the question of matchdays being spread over several calendar days, etc. These rule changes have always been introduced by the ‘programmers’ (i.e. FIFA and the football leagues), often resulting in intense conflict with fan groups. The only answer to that is democratization: that the players participate by collectively owning the game and having a vote in the decision-making over its rules. Wikipedia/Wikimedia and a number of Open Source software projects (such as Debian) demonstrate how this can work, although it’s far from a panacea and comes with its own set of issues.

The way the world is translated into numbers and raw data reminds me of how, to quote Karl Popper, “Greek philosophers had viewed the world as a huge edifice of which the material things were the building materials” (“The Open Society and Its Enemies”, 1945), up until Heraclitus and his revolutionary concept of change. According to this view, there can’t be any real developments in history. Everything has already been done and we are just remixing and repeating what happened in the past. If algorithms are used to calculate the past in order to produce data that can’t overcome their human-based, biased origins, should we take the use of big data as the continuation of this ancient attitude?

What you describe is definitely true for Deep Learning, which nowadays is commonly (but incorrectly) used as a synonym for Artificial Intelligence as a whole. We are currently at the beginning of an A.I./Deep Learning hype cycle at whose end people will, hopefully, see the limitations of that technology, precisely on the grounds that you describe. In my own contribution to the book, this is all I wanted to get across, using examples that better speak to the field I’m working in, arts and design.

Do you think it is even possible to develop systems to question our world that are not biased by our own views?

Simple answer: no.

[*] The original 1990s/2000s design concepts of “cultural probes”, “service design” and “gamification” explicitly reference Situationism.

