September 2, 2020, by Brigitte Nerlich
Mutant algorithms
I was talking to a friend in the United States. She told me the story of a friend who normally just talks about motherhood and apple pie, but who had suddenly started to wonder about algorithms. So, my friend asked me how I would explain algorithms. That reminded me of discussions I had a year or so ago with my mum, when she was still alive; she had asked me to explain what an algorithm was (the context was Facebook, YouTube and Brexit). I remember trying, but mainly failing.
Algorithms are now all around us in machines but also in language, it seems.
A-levels and algorithms
Just recently algorithms have reared their head again in the context of the ‘A-level controversy’ here in the UK, when those who were supposed to oversee standards of grading in schools were also asked to avoid grade inflation. So they devised an automated decision procedure, or algorithm, that would do both without them having to look at millions of exam results. The end result was a disaster, as the algorithm revealed exactly what has been going on for years here in the UK and probably elsewhere, namely that privileged pupils get better grades than those without such privileges.
The result of an obscure human decision-chain combined with a less obscure machine decision-making procedure was that many students had their results downgraded, especially those from less affluent areas and backgrounds. In the end, after a general outcry, these results were scrapped. Instead, human decisions by teachers, biased towards giving students the benefit of the doubt, were given precedence over machine decisions, biased towards not doing that.
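To make that mechanism a bit more concrete, here is a minimal, purely illustrative sketch in Python. It is not Ofqual’s actual model, which was far more elaborate; the function, the grade numbers and the example schools are all hypothetical. It simply shows how a standardisation procedure that uses a school’s historical results as a ceiling will pull down able pupils at historically lower-performing schools, whatever their individual performance.

```python
# A toy illustration only -- NOT Ofqual's actual model.
# It shows how capping grades at a school's historical average
# can downgrade able pupils at historically lower-performing schools.

def standardise(teacher_grades, school_history_avg):
    """Cap this year's grades at the school's historical average grade.

    teacher_grades: teacher-assessed grades as numbers (e.g. A* = 6 ... U = 0)
    school_history_avg: the school's average grade in previous years
    """
    ceiling = round(school_history_avg)
    # Any grade above the school's historical ceiling is pulled down to it.
    return [min(grade, ceiling) for grade in teacher_grades]

# Two cohorts with identical teacher-assessed grades (A = 5, B = 4):
well_off_school = standardise([5, 5, 4], school_history_avg=5.0)  # history of As
deprived_school = standardise([5, 5, 4], school_history_avg=3.0)  # history of Cs

print(well_off_school)  # [5, 5, 4] -- grades left untouched
print(deprived_school)  # [3, 3, 3] -- same pupils, downgraded
```

The real procedure worked with whole grade distributions rather than a simple cap, but the point stands: the bias does not come from the machine having a mind of its own; it comes from the human decision to let a school’s past set the limit on its pupils’ present.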
However, instead of allocating blame to the humans, from the Prime Minister down, who made this happen, the last piece in the decision-chain, the algorithm, was blamed for this disaster. That is not surprising. This sort of thing has happened for a long time, as it is easier to blame an inert machine than a reasoning human being, and it is easier to blame something you don’t understand than somebody who should understand.
So, I tweeted on August 14: “But blaming ‘the algorithm’ doesn’t help. It distracts from the biases that we humans live by and that the algorithm reveals. Focus on the humans and see how and whether THEY care for other humans.”
Did that happen? No, of course not. What happened instead was, however, entirely unexpected. Blame was attributed not only to an algorithm, but to a ‘mutant algorithm’, evoking images of a Frankensteinian monster stalking A-level grades, or of genetically modified A-level grades, if you want to be more modern. Nothing good anyway. As a letter to the Huddersfield Daily Examiner said on 28 August: “When I first saw the phrase ‘mutant algorithm’, my initial reaction was to wonder whether or not we were under attack by aliens from outer space.” Artificial intelligence meets aliens. Nothing to do with humans and human intelligence or the lack thereof.
The first use of ‘mutant algorithm’ in the news media
As I am at the moment struggling with mutant bacteria in my gut, I thought it would be opportune to look into this other mutant for my amusement and distraction.
I opened Lexis Nexis, the news database, and searched for “mutant algorithm”. There were almost 150 hits, all related to the A-level fiasco. But there was one from before! And that was interesting. The first attestation of ‘mutant algorithm’ on Lexis Nexis was from 2010, published in Investor’s Business Daily; the topic was volatility in the (stock) market (which is, as we all know, governed by algorithms). The article says:
“More inexplicable was the action of certain individual stocks. The sell-offs and run-ups in blue chips such as Procter & Gamble, Apple, Microsoft and 3M were so sharp and sudden, they took even seasoned traders’ breath away.
So what happened? Pick your favorite electronic buzzword: an erroneous (fat finger) order, a runaway program trade, a mutant algorithm, an errant high-frequency trading platform, a bungled black box, a loose quant program. But it doesn’t really matter.
What matters is what didn’t happen. No one was there to say ‘Stop! Something looks wrong here’ or ‘Let’s slow things down and avert a potential catastrophe.’ It appears that a stronger uptick rule with some teeth is needed…. No one applied the emergency brake.”
Why do I find this quote intriguing and interesting in the context of this year’s mutant algorithm? To understand that, we first have to know what this year’s mutant algorithm was.
Boris Johnson’s use of ‘mutant algorithm’
As The Guardian reported on 26 August, “Boris Johnson tried to blame a ‘mutant algorithm’ for the results chaos in a live address to pupils in England. The prime minister, who defended Ofqual’s controversial algorithm as ‘robust’ a fortnight ago, told children at an East Midlands school: ‘I’m afraid your grades were almost derailed by a mutant algorithm and I know how stressful that must’ve been for pupils up and down the country.’”
So, some external agent (a mutant algorithm), set in motion (on rails) by some unknown force, somehow inexorably ‘derailed’ the poor children’s A-level grades.
As an ITV transcript of 26 August said: “You may think that we have heard it all in the exams fiasco, but today, no less a figure than the PM laid the blame squarely at the door of a mutant algorithm. Sounds dangerous, like a meteor striking from outer space. Exactly who should be blamed for deciding that this year’s results would be decided by an algorithm, mutant or not, is a more open question.”
Blaming a mutant algorithm for a man-made disaster is a bit like blaming the sun for another imminent man-made disaster, or even blaming an ‘invisible enemy’ for a government’s failure to halt a pandemic. This is part of a blame-shifting language game to which we have become accustomed recently. The Irish Times (31 August) called the mutant algorithm the ‘latest scapegoat’ for government incompetence.
Unlike other blame games, this use of ‘mutant algorithm’ lends itself naturally to parody, as in this letter to The Independent (28 August), written after Johnson had given another speech urging children to attend school, while not yet advocating that they wear masks (that came later, as usual):
“It’s clear his Eton attendance – I hesitate to say education – clearly failed to ‘level him up’ to anything other than being a half-wit. However, on page five he takes the biscuit. How or why is it impossible to learn something wearing a face mask, as he says, ‘You can’t expect people to learn with face coverings’? Does he just open his mouth and let the next bit of drivel come out? Clearly the pupils at Castle Rock High School learned one thing – with or without a face covering – that given this government’s policies, it’s not so much a ‘mutant algorithm’ that’s ruining their educational prospects but a mutant prime minister.”
Algorithms, human action and responsibility
Let’s get back to the first use of ‘mutant algorithm’ in 2010, where we are given a list of synonyms, such as “an erroneous (fat finger) order”, “a runaway program trade”, “an errant high-frequency … platform”, “a bungled black box”, “a loose quant program”. In our case, the black box was less black than government officials later pretended, and the quant program was as loose as its makers wanted it to be, since they did not listen to the experts, such as the Royal Statistical Society, who alerted them to that looseness.
The other, even more important, part of the 2010 quote points towards something else that could have been done but wasn’t, and I mean done by humans and, dare I say, experts.
Somebody in charge should have said: “Stop, slow down, put the emergency brake on to avoid an impending catastrophe!” That type of action can only be carried out by humans in this situation. They are in charge. They are responsible. Nothing else is. Not the algorithm; humans. But they did not take action and they did not take responsibility.
Image: Maxpixel