The Barbican Centre’s gallery space, The Curve, has been taken over by one gargantuan art installation.
Trevor Paglen explores the ways in which artificial intelligence learns, focusing on the biases we teach these programmes. The implications are huge for our own futures, as AI technology will have widespread applications.
This exploration is important, but Paglen manages to do it in a visually captivating and somewhat humorous way. And the massive scale makes it all the more impressive.
He takes the ImageNet data set, a selection of over 14 million images widely used as an input for training AI, and displays 30,000 of them along the curved wall. From a distance it looks like a wave of colour that blurs together, but upon closer inspection it is far more interesting.
He begins with ‘apple’ and all the images associated with that word. The collage of photographs then grows to include apple orchards, labourers and an unsettling number of people riding banana boats. Each set of images blurs into the next, interconnected in some way by the data set’s categories.
Even at this point, the labelling is problematic. Labourers are shown only as people of Asian, Latino or Middle Eastern descent. There is one white man included in this section, but instead of doing hard labour, he sits in a cafe writing notes with a glass of wine.
The same racial biases arise throughout the entire tsunami of photographs. Investors are all old white men in suits, and divorce lawyers are, bizarrely, mostly middle-aged Asian men.
These racial biases are being passed on to new technologies, which will then act upon them as if they were truths. The possible implications are largely unknown, because the ways in which an AI interprets such vast quantities of data are still poorly understood. And the many ways in which these technologies will be integrated into our everyday lives are still up in the air.
As you make your way through The Curve, you come across ham and eggs sitting right above rats – not the most appetising juxtaposition. Pizza and pork align with abattoirs too.
The term ‘money grabber’ is surrounded by the likes of Justin Timberlake, Donatella, professional boxers, Steve Jobs, Donald Trump, Janet Jackson and Sarah Palin. They’re set alongside pythons, syringes and other celebrities. They lie above bottom feeders and nose flute players – because that’s a thing the ImageNet data set has decided to include too.
‘Traitors’ include the likes of Obama, George Bush and Saddam Hussein – many perspectives are conflated here. Thankfully, Oprah hasn’t been corrupted: she sits firmly in the centre of the group named ‘prophetess’.
It ends on ‘anomaly’ – hence the name of the exhibition, From ‘Apple’ to ‘Anomaly’. ‘Apple’ is harmless, but as you move further towards ‘anomaly’ (which is mostly made up of random people dressed up in costumes), you come across some uncomfortable connections between the people and things photographed and the categories assigned to them.
Move closer and further away as you glide along the onslaught of images to gain totally different perspectives. We even felt a great urge to go cross-eyed while walking about the space, making for a trippy kaleidoscope of colour that felt as if it were moving. From a visual perspective, it is mightily impressive.
But you can’t help but feel unsettled at the meaning behind it all. You feel powerless against the sheer mass of content – something AI takes on with ease.
But if this input of information is what lays the foundation for artificial intelligence’s reasoning and ethics, then there is greater reason to be fearful. Machines will use these biases (created by the humans in control) to eventually pass judgement on humankind. The pretty picture holds some darker possibilities.