
July 16, 2018, by Timothy Hill

Fear and Loathing on the Pokémon Go Trail. Also, Some Cautious Optimism: An Afternoon at the ODI

It’s 2 o’clock on a sultry summer afternoon; outside, London is still dreaming of a win against Croatia in the match that night. But here at the Open Data Institute Data Ethics seminar in EC2, things are going deep, fast.

I’m thinking out loud about medical records: patient data, researcher data. On my right, a woman from a think-tank and advocacy group is pondering the relationship between online services and hate-speech on their platforms. To my left, there’s a man from an organisation so secret he won’t tell us exactly why he’s there, beyond the fact that it’s got something to do with international security.

No-one knows what he’s thinking about. Maybe we don’t want to.

‘I guess …’ tails off the fourth person at the table, a researcher from a government ministry. ‘I guess it’s a question of whether what we need is a deontological or a utilitarian framework.’

No one’s too sure what he’s thinking, either.

But it’s a good bet that one thing on all of our minds is the Facebook/Cambridge Analytica judgement handed down that morning: a half-million pound fine for data breaches. And behind that, anxieties about the GDPR, a topic that turns up like a bad penny every time the conversation lulls. There’s a strong sense that almost everyone in the room is responsible for ensuring GDPR compliance in their organisations. And an equally strong sense that no-one’s exactly sure what that means. People are worried.

But this is the Open Data Institute, founded by Tim Berners-Lee to ensure that the data-sharing web he created will ‘enable us to innovate, create more efficient and effective services and products, and fuel economic growth and productivity’. It’s a strange place to be thinking about how to shut data down. To get rid of it.

And, according to our convenor, that isn’t really what data ethics is all about anyway. The mission of the Institute, she says, is to help groups and people ‘use data to make better decisions, more quickly’. And there’s often at least as much of an ethical imperative to share as there is to keep data private.

I nod sagely. Open Science. Clinical trials. I should have thought of it before. And looking around the table, I see Think-Tank Woman and Deontology Guy are likewise nodding. Only International Security Man remains impassive.

But still there’s an air of scepticism in the room. Does Open Data come with a ten-million-euros-or-2-percent-of-global-turnover maximum fine, like the GDPR does?

We talk a little about Blockchain, and how it’s being used to ensure accurate identification for Syrian refugees: the way it aids food distribution now; how it could be misused in future. We talk about Uber, and surge pricing, and the way prices quadrupled during the Sydney hostage crisis. Is that exploitative, just a market efficiency – or something that’s actually desirable? Tempers grow heated.

And then, says our convenor, there’s the question of Pokémon Go. Maybe we should workshop that out a little more closely, with the ODI’s Data Ethics Canvas?

We all reach to the centre of the table to grab our copy of the Canvas. There’s no doubt it’s an impressive document – filled with tough, incisive questions piercing straight to the heart of data, privacy, and security concerns. But even so, there’s no mistaking the atmosphere of anti-climax. Five minutes ago we were talking global geopolitics; now we’re on virtual anime bunnies or something.

Given the build-up, it’s not surprising that the conversation takes a dark turn. Immediately glomming on to all the direst squares in the Canvas, we talk of location-tracking. Of spoofing. Someone finds a website called Pokémon Go Death Tracker. And then there’s the question of disabled access to PokeStops – certain physical locations where players can collect valuable game rewards and tools free of charge. Beyond those locations, the items are still available – but you have to pay for your virtual tools with real money.  And what if the PokeStop is at the top of a rocky hill – and you’re in a wheelchair?

Where Go gets really flayed, though, is the socioeconomics of how PokeStop locations are selected. Initially they were created by crowdsourcing – with the crowd in question consisting largely of the kind of middle-class, mostly-white, mostly-male people who have the time and inclination to wander around scouting out good locations for game rewards. Maybe it’s a nice inclination. But the upshot is a game world where there’s plenty of free stuff and advantages for the middle classes. If you’re starting off in a disadvantaged neighbourhood – well, be prepared to pay up or lose out, sucka. By the time we’re done with it, Pokémon Go is looking less like a fun way of gamifying urban landscapes, and more like a sort of sinister, anime-bunny-fuelled social apartheid.

Gently the convenor admonishes us to look at some of the other squares in the Canvas. How might we start to solve these problems?

There is a momentary silence. This is a new thought.

‘Some kind of user feedback?’ suggests the international man of mystery, tentatively.

‘People with disabilities could flag up inaccessible sites,’ nods the woman from the advocacy group.

‘Plus visualisation,’ I say. ‘You could publish a heatmap with the density of PokeStops per postcode.’

The ministry researcher is way ahead of me. ‘And incentivise it. Contribute to the map and get a PokeStop with extra Poke Balls for your neighbourhood!’

We huddle. Redemption is at hand for Pokémon Go. Maybe a sliding scale for Poke Ball distribution? Easter eggs for frequent data-contributors? Before our eyes Go is morphing back from a tool of privilege into a game again – and from there, maybe, into something more. Gamer outreach? A data community? A means of re-establishing community, across sundered socioeconomic lines?

Pokémon Go is maybe five minutes away from becoming the foundation of a new rainbow-coloured data-sharing utopia when the convenor draws our attention to the time: the seminar is over. ‘So as you can see,’ she says, ‘the point of the Canvas isn’t to give you the answer. But it can help you understand the problems. And open up possibilities.’

It’s hard to believe three hours have passed. Gradually, the spell lifts. A tiny door has opened in our minds, and now we’re peering through it cautiously.

‘It’s a lot to think about,’ says the woman from the advocacy group. Normally, of course, this means that you intend to forget about something immediately.

But she says it abstractly, wonderingly.

I nod, distracted by all the Googling I’m doing, looking at the ODI’s other projects. Capitalising on Open Banking. IoT sensors for Knowable Buildings. Education in Data Anonymisation.

Hands are getting dirty. Things are getting built.

‘It’s more pragmatic than I thought,’ says the ministerial researcher, brow furrowed into the middle distance. ‘But it’s … big.’

I nod again, and turn to the man from the organisation for international security to find out what he thinks. But already he has disappeared, along with any Post-it notes and coffee cups that might have marked his presence among us.

Slowly we drift out into the warm evening air and an atmosphere now growing thick with expectation. A few early chants from the pubs are already spilling into the streets, as gradually our own new hopes and anxieties rise to mingle with those of the city around us; while, enveloping us invisibly, wi-fi and 4G hum with Facebook updates, Tweets, taunts, analysis, and last-minute bets being laid. I’m wondering how it might all go wrong. How it might all go right. And how it is that the notion of a nation united in a passion for a single game could have seemed so impossible and quixotic, just half an hour before.

Posted in Data Ethics, Digital research topics, Digital security, Uncategorized