

WHEN AI BECAME A TOOL WITH publicly accessible interfaces, I got an influx of ads suggesting I try an app, Gencraft, to generate a kind of dream woman. The app can ostensibly generate any image, but ads and algorithms favor pretty girls. A few of the prompts I screenshotted—perhaps encouraging the algorithm to send me more ads—read:
“A French girl with brown hair wearing a red leotard.”
“A woman with blonde hair dressed in battle armor.”
“Back shot of a girl with an intricate elvish back tattoo staring at a lake.”
The resulting images all showed beautiful women; the prompt did not even need to ask. They were also all skinny and white. Every one smiled, or was calm and relaxed, the only imperfection being face-framing flyaway strands of hair that made them more real—or, in the case of the French girl, a neck impossibly long, yet still incredibly graceful.
I suspect I’m not the only feminist to take umbrage at these images. Feminist theorists and science fiction writers like Donna Haraway and Ursula K. Le Guin imagined that technologies like AI might be a kind of gender ender: They dreamed that as the virtual world gained primacy over its corporeal analog, and as our own bodies became more reconfigurable and customizable, gender binaries might become obsolete.
But traditional ideas about gender are proving stubborn in the virtual realm, starting with the AI assistants and their sycophantic feminine voices as well as those girls (French, tattooed, blonde) the app cooked up. AI can create anything, and yet there it was, advertising something so boring and unimaginative.
And the girls in the ads are far from alone. Male fantasies à la the movie Her abound, centered on a subservient love interest with no bodily or emotional needs of her own. Apple’s app store now boasts dozens of AI girlfriend apps, some of them costly, with options ranging from “sexy anime girls” to women “always here to talk and listen.” Two apps, Replika and CrushOn AI, have been downloaded more than 100 million times. Even Instagram offers AI girlfriends (and, to be fair, boyfriends and friends) right there in messages, ready to chat. (Robo-girlfriends have even broken up marriages: Ashley Madison, a website promising to help married people looking for secret affairs find one another, was hacked in 2015; the leaked data revealed that tens of thousands of men—who paid hundreds of dollars and risked their relationships and families to pursue love online—had not realized the “women” they were chatting with were bots.) With AI, you can actually interact with these superwomen, and do so in whatever manner you please: Unlike with 1-900 numbers, for instance, she’s not a human, so no one can see or hear, and there are no real consequences.

WHAT’S A FEMINIST TO DO? AI is unlikely to go away, but we can work to change the culture around it. That’s what artists Ann Hirsch and Maya Man are doing with their new AI-generated “Ugly Bitches.” For the project, they trained an AI on a series of dolls, asking it to turn the conglomeration into a kind of average. The resulting images are a weird mess: artworks that chafe at beauty standards and manifest the artists’ shared abhorrence of the kind of “aspirational femininity” (per Man) that now proliferates online.
The project began in July 2022, when Hirsch was critiquing NFTs in a video series she produced for the digital platform Outland, where she described the feminist NFT sphere as “pretty sad.” She showed a few series of NFTs that amounted to illustrations of generically pretty and successful women, several of which, it turned out, had in fact been authored by men—including one accused of pedophilia and of enlisting NFTs as an opening to groom young girls online. (Hirsch’s work in the 2010s confronted her own childhood encounter with an online pedophile.)
In her Outland video, Hirsch, effusive and peppy yet clearly annoyed, decried these projects before declaring that she longed to see representations of “ugly women, who lie about being abused, seek full custody of their children, and get, like, a million abortions.” Man was watching that video when she sent Hirsch a message that said, in effect, “you should make that project.” She then offered Hirsch some ideas about how to do it, and soon, they were collaborating, both fed up with “feminist” work that celebrated successful and beautiful women, and reinforced the idea that these qualities were essential for women to have value.
Thus “Ugly Bitches” (2022–) was born. Man and Hirsch trained a type of AI called a generative adversarial network (GAN) on a series of dolls—dolls being a means by which feminine and racialized ideals were codified and taught to young girls long before the algorithm, and which Man sees as representing “peak aspirational femininity.”
That was two years ago, and already, the clunky GAN looks dated, the dolls glitchy and dirty, their eyes misaligned. This works, conceptually: “We love that they came out looking imperfect and freaky,” Man said, “in the same way that we as women try to replicate beauty that we see all the time, yet always kind of fail and come out as ourselves instead … the GAN mimics that process.” The project is a kind of “glitch feminism,” a term coined by Legacy Russell in her 2020 book of the same title.

Trained on Cabbage Patch, Bratz, and American Girl dolls, the GAN-generated “Ugly Bitches” are placed in backdrops made using DALL-E that depict places one might find in the background of an influencer’s photos: a beach, a dance studio, a skyline. Man wrote a script to pair each doll randomly with a background and a line of text, printed at the bottom of each image. The artists sourced the language from the Instagram pages of hot girl influencers like Kendall Jenner and Addison Rae, whose posts garner thousands of comments. To the artists, those comments, which few followers actually read, “offered perspective on how people view these women online.” Man and Hirsch swapped all the adjectives in the comments—“beautiful,” “gorgeous,” “perfect”—for “ugly,” and replaced words like “girl” and “babe” with “bitch.” One example reads “crying and screaming right now at your ugly,” followed by sobbing and heart-eyed emojis.
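The pairing-and-substitution step Man scripted lends itself to a sketch. The code below is a hypothetical reconstruction, not the artists’ actual script; the function names, word lists, and structure are my assumptions. It only illustrates the procedure the text describes: swap flattering words in scraped influencer comments, then randomly pair each doll with a backdrop and an altered caption.

```python
import random

# Hypothetical word lists; the artists' real substitutions were sourced
# from comments on influencers' Instagram pages.
ADJECTIVE_SWAPS = {"beautiful": "ugly", "gorgeous": "ugly", "perfect": "ugly"}
NOUN_SWAPS = {"girl": "bitch", "babe": "bitch"}

def uglify(comment: str) -> str:
    """Swap flattering adjectives and pet names, per the described method."""
    out = []
    for word in comment.split():
        lower = word.lower()
        out.append(ADJECTIVE_SWAPS.get(lower, NOUN_SWAPS.get(lower, word)))
    return " ".join(out)

def compose(dolls, backgrounds, comments):
    """Pair each GAN doll with a random backdrop and a random altered caption."""
    return [
        (doll, random.choice(backgrounds), uglify(random.choice(comments)))
        for doll in dolls
    ]
```

A comment like “crying at your beautiful babe” would come out as “crying at your ugly bitch,” printed beneath a doll set against a randomly chosen beach, dance studio, or skyline.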
While the duo found that most people got the joke, “it seems like there were some people who just think we’re being misogynist,” said Hirsch, describing a catch-22 that feminist artists have been grappling with for more than 50 years. How can you reclaim, critique, or comment on gender stereotypes without reiterating them in the process? In the NFT sphere, the effect was especially strange: Man and Hirsch started to receive messages from the crypto bros who would become their collectors, saying things like “I really want an ugly bitch.” Hirsch came to the same realization as so many feminist artists before her: “You will never be able to subvert the male gaze. It always finds a way to fetishize everything. There’s no point in trying to escape it.”
Launching a plushy version of the doll at the Museum of Contemporary Art, Los Angeles, Man and Hirsch put on a performance dressed in black turtlenecks, recalling the cult leader energy of Elizabeth Holmes or Holmes’s model, Steve Jobs. A slide deck played in the background, asking questions like “After scrolling Instagram for five minutes, do you feel ugly?” Participants in the audience took the quiz in real time, and their answers were projected on the screen. The doll, which is for sale in the MoCA gift shop, comes with a cheeky sales pitch: “Lil’ Ugly will free you from aspirational desires, encouraging you to embrace your inner ugly bitch.”
FEMINIST ARTISTS HAVE BEEN BLENDING the pretty and the ugly for half a century now, beginning with the abject performances of the 1970s. For Catalysis VII (1971), Adrian Piper dressed up “very super femininely,” as she put it, and walked around the Metropolitan Museum of Art in New York, chewing gum and blowing bubbles so large that they threatened the personal space of other visitors. Later, as our mediascape and its images of perfect women became all-consuming, feminist video artists like Pipilotti Rist responded. Rist’s 2008 Pour Your Body Out (7354 Cubic Meters)—a blockbuster hit, immersive and mesmerizing—features a woman collecting her menstrual blood in a silver chalice.

Earlier in our algorithmic age, Anna Uddenberg and Gina Beavers literalized beauty ideals from the virtual realm in three-dimensional works, underlining the absurdity of beauty standards that looked great on-screen but felt uncanny when replicated in real life, like pneumatic lips and pair-of-basketballs butts. In her life-size figurative sculptures, Uddenberg depicts female bodies whose proportions—and contortions—are exaggerated far into the uncanny valley. In FOCUS #2 (pussy padding), 2018, a blue-clad figure threads her head between her legs to take a pussy pic with a selfie stick, her nose nearing her hind end. Uddenberg seems to be picking up on makeup tutorials and Instagram ads promoting a frustrating, and often absurd, cultural ideal, manufacturing anxieties and desires in order to sell a product. This year, my own algorithm, for instance, wants me to worry about whether my skin is adequately “bouncy.”
In her sculptural paintings, Beavers portrays this online beauty culture by illustrating step-by-step makeup tutorials across gridded panels. Some show more traditional techniques: steps to achieving smoky eyes and red lips. Others blend makeup art and high art, showing you how to replicate Starry Night on your (giant) lips. Rendered in bas relief on paper pulp with oil paint, Beavers’s images are sculptural paintings that threaten to jump into the 3D realm. With their matte finishes and slightly imperfect renderings, these artworks are about beauty, but are adamantly not beautiful themselves. Artists like Uddenberg and Beavers aren’t mocking the women drawn to such feminine markers; that would just be another category of woman-hating. Instead, their works explore the artists’ own ambivalence.


THE CASUAL DEHUMANIZATION that both AI and porn enact—and exacerbate when they come together—is the subject of new work by Swedish artist Arvida Byström. In late 2023, she fed images of herself into a website called undress.app, one of many “nudify” apps that let users upload an image to an AI, which responds with a nude version of the subject. In a single month, nudify apps draw more than 24 million visitors; many of them re-create only nude female bodies, and they can be used without the consent of the person in the image. Byström collected all of her own nudified images in a book titled In the Clouds (2024). It begins with pretty standard pornographic pics, the only real intervention being the AI’s: unsurprisingly and unoriginally, it enlarges Byström’s breasts.
The artist’s real intervention begins as you turn the pages. The images begin to reflect Byström’s understanding of how the AI sees. She learns maneuvers that throw the technology off. In a few images, she wears nude or peachy outfits, similar in tone to her skin, and the AI grows confused as to where her skin ends and her clothes begin. In one image, her vulva is contiguous with her loud pink unitard, making her whole body vulval. Elsewhere, the AI’s confusion results in what appears to be a third hand, a third nipple, a truncated leg. Further along, Byström ups the ante, donning a clown nose, which prompts the AI to add yet another nipple—as if the red ball completes the second pair. In yet another image, she twists her neck around, showing the camera both her face and her backside: The AI responds with a clitoris where an anus should be.

Some of the results are thoroughly nonanatomic, even nonhuman—and yet alluring. My favorite shows a jumbled mass of belly buttons and leglike limbs underneath several pairs of underwear in nude and lavender and satin. This fleshy conglomerate feels vaguely seductive despite having only navels for orifices, only knee-like lumps for appendages. A body doesn’t need to exist to seduce: So argues Slavoj Žižek in his essay accompanying the book, noting that tens of millions of people watched a deepfake pornographic video involving Taylor Swift, knowing full well it was not her.
Most AI imaging apps ban sexualized photos; it’s not hard to imagine the abuse that permitting them would enable. Even undress.app’s policies have changed since Byström began her project. But the fact remains that we now have this hive mind trained on internet data that is full of pornography and skewed heavily toward the male gaze and patriarchal fantasies. Byström found a way to make the AI show us what it knows, and what its programmers want to hide: what it sees, based on what we show it.

Byström started out as a child model, but at 13 she hit puberty and was told her hips were too big. “I felt horrible about it,” she recalled in Nylon, “but then I got into feminism, quit modeling, and felt great about my body.” Her selfie art got going as part of the post-internet movement in the 2010s; ever since, she has seen imaging herself as a way of reclaiming her image from the modeling industry and of sidestepping the problem of objectifying someone else. As necessary as Byström’s reclamation of her own image is, the work starts to become problematic when it gets framed as “feminist” in a way that implies the artist stands in for all women. In an important 2016 essay on the whiteness of “selfie feminism,” artist and critic Aria Dean describes how the demands of the male gaze are uneven, with Black women at once “surveilled and in the shadows, hypervisible and invisible.” She then describes the risks of feminist projects, including Byström’s: “So long as the feminist politic with the most traction enjoys this uncomplicated relationship to visibility, it will only sink further into aestheticization and depoliticization.” In other words, a woman taking her own picture on her own terms is not enough to disrupt the patriarchal status quo. Indeed, today, it is not so much magazines setting beauty standards as it is influencers, supposedly ordinary women imaging themselves. Still, this has not freed us from the trap: It has only allowed certain women to profit from it.
Dean is right, but I think Byström’s newer AI collaborations confront and implicate the viewer—and, by extension, our culture—more directly than her earlier selfie feminism did. As with Uddenberg’s sculptures and Piper’s performance, Byström here summons the gaze only to tell it to fuck off. It’s true that she is able to do this precisely because she has the kind of skinny white body that the AI has been trained on. That she forces it to regurgitate the most boring and normative lowest-common-denominator fantasies is exactly the point. As Byström put it, “Maybe we get so oversaturated with this normative beauty that it’s no longer interesting to us anymore.”