Refiguring the Future
Eleanor Heartney | Brooklyn Rail | May 2018
Science fiction is not the only place where algorithms are beginning to control huge swaths of contemporary life. If you are arrested, algorithms can set your bail. If you are found guilty of a crime, they can determine your prison sentence. (In both cases, studies reveal an algorithmic bias against black people.) In certain states they decide who gets social services and who doesn't. (As employed by Arkansas and Indiana, such algorithm-based systems have exacerbated inequality by greatly decreasing coverage for disability services, home care, and welfare benefits.) Algorithms decide who gets mortgages, who gets credit, who gets hired, and who is pulled aside by the TSA for additional screening at the airport. And of course, recent revelations about Facebook's use of personal information have made us all aware that algorithms determine what news we see, what products we buy, and even, perhaps, whom we vote for.
Is there any way to mitigate the pernicious impact of the algorithmic takeover of life? Refiguring the Future, a conference held this May in Chicago, positioned artists as a first line of defense against Big Tech. Organized by the NetGain Partnership (a collaboration among the Ford Foundation, Knight Foundation, MacArthur Foundation, Mozilla Foundation, and the Open Society Foundations), it presented a variety of speakers, including artists, funders, academics, and technology specialists, who critiqued the current system and suggested various lines of resistance.
A few takeaways from the conference:
* Artificial Intelligence (AI) is largely being created by young straight white males whose assumptions and unconscious biases are then baked into the algorithms adopted by both industry and government. (As a counterbalance, women, people of color, and non-straight individuals made up the majority of the conference presenters and attendees.) This bias can have consequences that range from the amusing to the tragic: face recognition systems that fail to recognize darker skin, and a training database of internet-sourced images whose most recurrent face, because the images were gathered in 2002, is that of George W. Bush.
* There is often no transparency about automated decision-making systems because the private companies developing these algorithms claim proprietary rights, making it very difficult to appeal or resist their determinations.
* AI doesn't have to be socially regressive; there are also programs that help individuals navigate forbidding bureaucratic systems. But because the overall aim of most AI systems is to increase efficiency, often at the expense of jobs and social equality, they aren't a great fit with larger goals of justice and social progress.
ELEANOR HEARTNEY is a New York-based art critic and the author of numerous books about contemporary art. Her Postmodern Heretics: The Catholic Imagination in Contemporary Art has just been reissued by Silver Hollow Press.