Why Is AI So Bad at Generating Images of Kamala Harris?

Will Knight

When Elon Musk shared an image showing Kamala Harris dressed as a “communist dictator” on X last week, it was quite obviously a fake, seeing as Harris is neither a communist nor, to the best of our knowledge, a Soviet cosplayer. And, as many observers noted, the woman in the photo, presumably generated by X’s Grok tool, had only a passing resemblance to the vice president.

“AI still is unable to accurately depict Kamala Harris,” one X user wrote. “Looks like they’re posting some random Latina woman.”

“Grok put old Eva Longoria in a snazzy outfit and called it a day,” another quipped, noting the similarity of the “dictator” pictured to the Desperate Housewives star.

“AI just CANNOT replicate Kamala Harris,” a third posted. “It’s uncanny how failed the algorithm is at an AMERICAN (of South Indian and Jamaican heritage).”

Many AI images of Harris are similarly bad. A tweet featuring an AI-generated video showing Harris and Donald Trump in a romantic relationship—it culminates in her holding their love child, which looks like Trump—has nearly 28 million views on X. Throughout the montage, Harris morphs into what look like different people, while the notably better Trump imagery remains fairly consistent.

When we tried using Grok to create a photo of Harris and Trump putting their differences aside to read a copy of WIRED, the results repeatedly depicted the ex-president accurately while getting Harris wrong. The vice president appeared with varying features, hairstyles, and skin tones. On a few occasions, she looked more like former First Lady Michelle Obama.

Grok is different from some high-profile AI image generators in that it allows users to create faked photos of political figures. Earlier this year, Midjourney began blocking its users from creating images of Trump and President Joe Biden. (The ban extends to Harris.) The move followed publication of a report by the Center for Countering Digital Hate that found that the tool could be used to generate a range of politically charged images.

Similarly, OpenAI’s ChatGPT and Google’s Gemini refused to produce images of Harris or Trump in WIRED’s testing. Meanwhile, a number of open source image generators will, like Grok, produce images of politicians. WIRED found that one such model, Stable Diffusion, also produced not-great pictures of Harris.

Modern AI image generators use what are known as diffusion models to generate images from text prompts. These models are fed many thousands of labeled images, typically scraped from the web or collected from other sources. Joaquin Cuenca Abela, CEO of Freepik, a company that hosts various AI tools, including several image generators, tells WIRED that such generators have more difficulty conjuring up Harris than Trump because they have been fed fewer well-labeled pictures of her.
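For readers curious what “using a diffusion model” looks like in practice, the sketch below shows roughly how an open source generator such as Stable Diffusion is typically invoked in code. It assumes the Hugging Face diffusers library and an example public checkpoint; the model name, prompt, and settings are illustrative, not the setup WIRED used in its testing.

```python
# Minimal sketch: generating an image from a text prompt with an open source
# diffusion model via the Hugging Face "diffusers" library. The checkpoint
# and prompt below are illustrative placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example public checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # use "cpu" if no GPU is available, but it will be much slower

# The model can only render what its training data taught it to associate with
# the words in the prompt, which is why sparsely photographed or poorly labeled
# subjects tend to come out looking "off."
prompt = "a photo of a politician giving a speech at a podium"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("output.png")
```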

Despite being a prominent figure, Harris hasn’t been as widely photographed as Trump. WIRED’s search of photo supplier Getty Images bears this out; it returned 63,295 images of Harris compared to 561,778 of Trump. Given her relatively recent entry into the presidential race, Harris is “a new celebrity,” as far as AI image makers are concerned, according to Cuenca Abela. “It always takes a few months to catch up,” he says.

That Harris is a Black woman, of Jamaican and Indian descent, also may be a factor. Irene Solaiman, head of global policy at AI company Hugging Face, says that “poorer facial recognition for darker skin tones and femme features” may affect the sorting of images of Harris for automated labeling. The issue of facial recognition failing to identify female and darker-skinned faces was first highlighted by the 2018 Gender Shades study published by Joy Buolamwini, an MIT researcher, and Timnit Gebru, now the founder and executive director of the Distributed Artificial Intelligence Research Institute.

There may be yet another reason why AI portrayals of Harris are not especially good. “The images are not being created to be photorealistic but rather are being created to push a narrative,” says Hany Farid, an expert on deepfake detection and cofounder of GetReal Labs, a startup offering software to catch fake media.

In other words, those sharing AI-generated images of Harris may often be more interested in producing meme-worthy scenarios than refining the realism of her likeness. The “communist dictator” image shared by Musk and the video in which Harris holds her Trumpy baby both serve to ridicule and denigrate the Democratic candidate rather than spread disinformation.

Ari Lightman, professor of digital media and marketing at Carnegie Mellon University’s Heinz College, says some people may even be selecting bad Harris images on purpose in an effort to emphasize the idea that she is a fraud. “This is an AI-generated communications era,” Lightman says. “If it’s done crudely, it’s designed to send a message.”