‘Sorry for killing most of humanity’: New exhibition in San Francisco explores the dangers of AI

Have you ever wondered what life would be like if artificial intelligence became too powerful?

A new exhibition titled the ‘Misalignment Museum’, which has opened to the public in San Francisco – the beating heart of the tech revolution – looks to explore just that, featuring AI artworks meant to help visitors think about the potential dangers of artificial intelligence.

The exhibits in this temporary show mix the disturbing with the comic: the first display has an AI offer pithy observations to visitors who cross into its line of vision.

“The concept of the museum is that we are in a post-apocalyptic world where artificial general intelligence has already destroyed most of humanity,” said Audrey Kim, curator of the show.

“But then the AI realises that was bad and creates a type of memorial to humans, so our show’s tagline is ‘sorry for killing most of humanity.’”

Discussions about the safety of artificial intelligence have so far remained within “niche intellectual circles on Twitter,” Kim explained, adding that they have rarely been easily accessible to the general public.

The exhibit occupies a small space in a street corner building in San Francisco’s Mission neighbourhood.

The lower floor of the exhibition is dedicated to AI as a nightmarish dystopia where a machine powered by GPT-3, the language model behind ChatGPT, composes spiteful calligrams against humanity, in cursive writing.

One exhibit is an AI-generated – and fake – dialogue between the philosopher Slavoj Žižek and the filmmaker Werner Herzog, two of Europe’s most respected intellectuals.

This “Infinite Conversation” is a meditation on deep fakes: images, sound or video that aim to manipulate opinion by impersonating real people and that have become the latest disinformation weapon online.

The installation is scheduled to run until 1 May, but the creators hope to make it a permanent exhibit.
