A new book about big data, digital surveillance, facial recognition software and invasion of privacy has just been published in SA.
Internationally, quite a few books on these subjects have been released lately. They include such catchily titled fun reads as The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power and Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.
What makes this one a little different is that it’s called Boris the BabyBot: A Little Book about Big Data, and that it’s written for children.
Children’s fairy tales have a long history of harnessing appalling and scary subjects to warn of the perils of the world. Hansel and Gretel, for example, teaches children about infanticide (the mother leaves the children to die in the woods because she can’t feed them), cannibalism (they are kidnapped by a witch who eats children), and how to murder adults for profit (Gretel burns the witch alive, and is rewarded by the discovery of a vase full of jewels).
Cultural theorists have suggested that the tale of Hansel and Gretel arose out of the great medieval famines of the 14th century, when life expectancy — for rich people — was 29.84 years.
There are quite a few examples of children’s fairy tales and playground nursery rhymes being reflective of some great societal calamity, albeit retrospectively in some cases. Take the playground song "Ring a Ring o’ Rosie", one version of which goes something like this: "Ring around the rosies / A pocket full of posies / Ashes! Ashes! / We all fall down."
Popularly accepted explanations (though not really historically accurate) suggest that the song refers to the 17th century Great Plague in England, with the rosy rash as a plague symptom, and the posies of herbs carried to counteract the smell of the disease. "We all fall down" speaks for itself.
In the same way, Boris the BabyBot is a harbinger of one of the great, and potentially disastrous, social challenges of our time: global surveillance driven by artificial intelligence (AI).
As astonishing as it might seem to many of us, we now have to teach children about how big data invades their lives, and how AI and surveillance will control the way they live and the societies in which they live. And, crucially, how digital surveillance will infringe on the basic freedoms democracies hold dear.
The book starts with as ominous an opening as any of the great classics. "Boris the BabyBot is sent from the factory to track all the babies. First, Boris needs a picture of baby’s face: Beep beep boop! Scanning baby’s eyes!"
It’s probably worth mentioning here that the author of Boris the BabyBot is Murray Hunter, recently of SA advocacy organisation Right2Know Campaign.
Right2Know’s 2018 report, "Spooked", revealed how various state security agencies, as well as the private sector, spied on SA journalists.
The organisation was also party to a successful application brought by investigative journalism outfit amaBhungane to have the National Communication Centre’s collection and analysis of large amounts of the public’s metadata — without any legal oversight — stopped and declared unconstitutional.
In September, the high court in Pretoria declared this mass surveillance unlawful and invalid.
"The Global Expansion of AI Surveillance", a 2019 report from the Carnegie Endowment for International Peace, says: "A growing number of states are deploying advanced AI surveillance tools to monitor, track, and surveil citizens to accomplish a range of policy objectives — some lawful, others that violate human rights, and many of which fall into a murky middle ground. AI surveillance technology is spreading at a faster rate to a wider range of countries than experts have commonly understood. At least 75 out of 176 countries globally are actively using AI technologies for surveillance purposes."
Not all uses of AI surveillance technology are illegitimate or nefarious at the outset, but because regulations and laws haven’t caught up yet, they all have the potential to end up in that murky middle ground.
The Joburg suburb of Parkhurst has just signed a contract with a company called Vumacam that, according to Vice magazine, will result in the deployment of CCTV cameras using software called iSentry.
This, says Vice, "works by fixing a camera to a single spot and letting it film for an extended period of time. Eventually the software learns what to look for. It issues alarms when it determines something is ‘abnormal’ — like loitering pedestrians or minivans — and prioritises video streams for review by human operators in a control room."
Once the camera rollout is sufficiently widespread, says Vice, "people and cars can be identified and tracked across the networks, behaviour scrutinised by automated systems for security and police forces, and patterns of life evaluated by algorithms for government and corporate analysis".
Or, as Boris the BabyBot explains it: "Boris sends reports about all the babies to The Factory! The Factory makes money by deciding who’s a Good Baby and who’s a Bad Baby."
Vumacam says it will soon have 15,000 cameras blanketing Joburg. That includes a rollout in Alexandra and Soweto when fibre is installed there, so they can be added to the smart-camera network. The company is also rolling out camera surveillance to schools in Joburg as part of its Kids Custodian Initiative.
Boris the BabyBot would approve, though the rest of us may feel a premonitory dystopian shiver.
Some may say it’s all to the good: what noncriminal South African (there are a few left) wouldn’t be for a system that can help bring down our incredibly high levels of crime?
But as Vice points out, algorithms are not neutral.
"AI is often imbued with bias by the small and homogenous groups of engineers who create it," Artificial Unintelligence author Meredith Broussard is quoted as saying. "The problem with these kinds of systems is that they tend to pick up all the underlying racist assumptions of the culture that makes the system."
David Lemayian, my colleague at civic tech organisation Code For Africa, puts it succinctly in a Daily Maverick article: "Algorithms can be racist."
You don’t need me to point out how AI digital surveillance can be used for evil. But I’m going to anyway: the Chinese government, for example, has about 200-million surveillance cameras as part of its Sharp Eyes programme, which evaluates and ranks the trustworthiness of citizens. Face scanners at the entrances of shopping malls, mosques and at traffic crossings allow the government to track and control the movement of citizens and their access to phone and bank services.
Cameras have been rolled out extensively in the Xinjiang province, where a Muslim minority ethnic group called the Uighur makes up almost half the population.
According to Human Rights Watch, "since late 2016, the Chinese government has subjected the 13-million ethnic Uighurs and other Turkic Muslims in Xinjiang to mass arbitrary detention, forced political indoctrination, restrictions on movement and religious oppression. Credible estimates indicate that under this heightened repression, up to 1-million people are being held in ‘political education’ camps."
The kind of everyday behaviour that Xinjiang authorities consider suspicious includes "not socialising with neighbours, often avoiding using the front door".
And, yes, there’s an app for that. It collects data, reports on suspicious activities and prompts investigative missions. When the app finds someone who it thinks is acting suspiciously, it alerts the government.
Unsurprisingly, China is a major purveyor of AI surveillance worldwide. Chinese companies supply AI surveillance technology in 63 countries, and Huawei — the company blacklisted by Donald Trump’s administration over security concerns — is responsible for providing AI surveillance technology to at least 50 countries.
In Kenya this month, the company made a presentation to a gathering of mayors and government officials from all over Africa, selling the virtues and opportunities attendant on setting up an AI-driven surveillance network.
According to the Mail & Guardian, after the presentation there was a "queue of local government officials wanting to speak to Huawei’s representatives. The pitch appears to have worked."
China isn’t the only culprit when it comes to pushing the virtues of social control through digital surveillance. US surveillance technology companies are present in 32 countries, with Germany, Japan and France also contributing.
If you want a scary surveillance statistic: by 2028 the US will have more lightweight surveillance drones than the rest of the world combined — 43,001 of them (The Guardian’s strangely precise number) capable of misidentifying wedding parties as terrorist gatherings for combat drones to bomb.
And, of course, the Four Tech Bros of the Apocalypse — Google, Facebook, Twitter and Amazon — basically dictate how we see our world today.
There is hope for us, and there are solutions. In the children’s book, Boris the BabyBot is eventually thwarted. "There is still one baby that Boris hasn’t tracked. Beep beep blaarrp! Too much mess to see baby’s face!"
A Polish designer has created facial jewellery that allows you to stay anonymous and not be tracked, and in Hong Kong activists are using face masks to foil facial recognition software. (A government ban on face masks is before the courts there — see page 40.)
There are ways to mess with the system. But this is cosmetic messiness. The real messiness we have to create, as Lemayian writes, comes from petitioning and lobbying our governments to adopt a governance framework for algorithmic accountability and transparency.
Hardly any of the old fairy tales ever had a happy ending — but what they do teach is a way of looking at the world that allows you to fight back.