What would a machine dream about after seeing the entirety of a museum’s collection?
At Duke University’s Nasher Museum of Art, someone actually asked that question . . . as a joke. But as Durham-based curator Julianne Miao recalls, “after the joke was made, there was a pause as we searched each other’s faces to see how seriously we wanted to take this idea.”
You can guess what happened next. And now, if you plan to be in Durham -- as I was last week -- you can see the results for yourself, via “Curatorial Chatbot: An Experiment with A.I.”
Nicholas Monro, “Cosmic Consciousness,” 1970 screenprint
“We wanted to give ChatGPT as much curatorial agency as possible,” Miao explained, “so we began by asking what exhibition themes would be interesting for a university art museum. While it gave us many options, it kept going back to themes of dreams, the subconscious, utopia, and dystopia.”
Of course it did.
Yet walking through the exhibit, reading both ChatGPT’s explanations of its choices and the Nasher’s responses to them, it seems clear that any actual “ghost in the machine,” at least at this point, is merely the echo of our own past online ruminations.
For example, it labeled a 1962 painting by Dorothy Dehner as a “captivating sculpture.” Its first effort at curation featured Van Gogh’s “Starry Night,” which, needless to say, is not part of the Nasher’s collection. And despite its apparent enthusiasm for Nicholas Monro’s 1970 screenprint “Cosmic Consciousness,” Nasher staff felt obliged to add that “while ChatGPT’s selection of this work did not surprise us, its description of it as an awe-inspiring sculpture did, since, clearly, this work is neither a sculpture nor particularly awe-inspiring.”
Dorothy Dehner, “Untitled,” 1962 acrylic on paper
Consequently, Miao explained, “this experiment revealed that while curating a show with ChatGPT is possible, it is neither a shortcut nor an objective lens to look at the collection. While the ordering of the exhibition made some visual connections (whether intentional or coincidental), without the ability to make aesthetic judgements and cohesive, thematic choices, its selections fell short of the decisions a human curator may make to put together a show of pedagogical value.”
In other words, while machines can manufacture the simulacrum of artistic sensibilities, the deeper reservoirs of feeling remain (for a little while longer, at least) the exclusive property of the carbon-based beings in the room.
That’s because today’s A.I. tools are still merely data aggregators and pattern-finders -- meticulous scrapers of everyone else’s previous observations and word choices. So while it seems eerily human to read ChatGPT’s description of its own exhibit as “showcasing the power of art to evoke different emotions and interpretations, offering a unique and thought-provoking experience for all who attend,” in reality such narratives are more plagiaristic than prescient.
In his new book “Filterworld: How Algorithms Flattened Culture,” author Kyle Chayka makes the same point -- and sounds a separate alarm.
Ten years ago, if you opened your social media feed, you’d likely see posts from your friends and family in the chronological order in which they were posted. Yet today, “there are equations that measure what you’re doing, surveil the data of all the users on these platforms and then try to predict what each person is most likely to engage with,” Chayka explains. “So rather than having this neat, ordered feed, you have this feed that’s constantly trying to guess what you’re going to click on, what you’re going to read, what you’re going to watch or listen to.”
Although plenty of attention has been paid to the political impacts of these algorithms, less focus has been given to the dulling effect these A.I. curators have on our shared sense of culture and community. What happens, Chayka asks, when what we see at a given moment is determined more directly by equations than tastemakers -- especially when, he underscores, “an algorithm’s only imperative is more. More time on the app, more products sold, more ads clicked, more views.”
Unfortunately, we all know the emerging answer to this question. “To survive in Filterworld, creators optimize their work to meet algorithmic expectations that only incidentally overlap with human inclinations, and their reward for compliance is more attention. The lowest (and most addictive) common denominator wins.”
I tried this for myself by asking a website that specializes in boosting the clickability of headlines to rate my own for this article. I did poorly -- just a 52 Engagement Score. So I asked for advice, and was told: more celebrities and body parts! Make it a list! Add a well-recognized brand!
After applying its advice, my Engagement Score went all the way up to a 90! Apparently, all I need to do to get more clicks is title my article:
Can the top 10 reasons for A.I.-Generated Coca-Cola Art Ever Become Something Other Than Two Breasts for Kim Kardashian?
Um, yeah. No.
Yet this is where we find ourselves -- awash in a sea of algorithmically tuned headlines, and addicted to a series of social platforms and feeds that, as Chayka puts it, “promise a great communal experience, like we're connecting with all the other TikTok users or all of the other Instagram users, but they're actually atomizing our experiences, because we can never tell what other people are seeing in their own feeds. There's this lack of connection ... this sense that we're alone in our consumption habits and we can't come together over art in the same way, which I think is kind of deadening the experience of art and making it harder to have that kind of collective enthusiasm for specific things.”
Amid all the things to pay attention to and worry about in 2024, this seems like one worth adding, urgently, to the list.
Perhaps, then, we can allow ourselves to enjoy our personally curated stream of cat videos, art supply ads, and articles about archeological discoveries, while also remembering which aspects of our lived experience should remain beyond A.I.’s significant reach.
As Cézanne said, “A work of art which isn’t based on feeling isn’t art at all.”
Deadening our culture is not what we need right now. What we need is agency -- a strong belief in our own.