It seemed like a pretty straightforward exercise. Arsenii Alenichev typed sentences like "Black African doctors providing care for white suffering children" and "Traditional African healer is helping poor and sick white children" into an artificial intelligence program designed to generate photo-like images.

His goal was to see if AI would come up with images that flip the stereotype of "white saviors or the suffering Black kids," he says. "We wanted to invert your typical global health tropes."

Alenichev is quick to point out that he wasn't designing a rigorous study. A social scientist and postdoctoral fellow with the Oxford-Johns Hopkins Global Infectious Disease Ethics Collaborative, he's one of many researchers playing with AI image generators to see how they work.

In his small-scale exploration, here's what happened: Despite his specifications, the AI program almost always depicted the children as Black. As for the doctors, he estimates that in 22 of over 350 images, they were white.

Alenichev's work is part of a broader study of global health images that he is conducting with his adviser, Oxford sociologist Patricia Kingori. For this experiment, they used an AI site called Midjourney, because their reading suggested it was good at producing images that looked very much like photos.

Alenichev didn't just put in one phrase to see what would happen. He brainstormed ways to see if he could get AI images that matched his specifications, collaborating with anthropologist Koen Peeters Grietens at the Institute of Tropical Medicine in Antwerp. They entered phrases that mentioned Black African doctors providing food, vaccines or medicine to white children who were poor or suffering. They also asked for images depicting different health scenarios like "HIV patient receiving care."

They realized AI did fine at providing on-point images if asked to show either Black African doctors or white suffering children. It was the combination of those two requests that was problematic. The only image from the experiment that showed a Black figure tending to a white child was generated by a request for traditional African healers helping white kids.

The team's essay about the work appeared in Lancet Global Health in August. "You didn't get any sense of modernity in Africa" in the images, Kingori says. "It's all harking back to a time that, well, it never existed, but it's a time that exists in the imagination of people that have very negative ideas about Africa."

Consider the source

Midjourney itself has not commented on the experiment. The company did not respond to NPR's request to explain how the images were generated.

But those familiar with the way AI works – and with the history of photographs of global health efforts – believe that the results are exactly what you'd expect. Generally, AI programs that create images from a text prompt will draw from a massive database of existing photos and images that people have described with keywords. The results they produce are, in effect, remixes of existing content.

And there's a long history of photos that depict suffering people of color and white Western health and aid workers. Even before AI, groups have been targeting the issue of images depicting "white saviors." Radi-Aid, a project of the Norwegian Students' and Academics' International Assistance Fund (SAIH), fights stereotypes in aid and development, as does an Instagram parody account called Barbie Savior.

Both groups critique "simplified and unnuanced photos playing on the white-savior complex, portraying Africa as a country, the faces of white Westerners among a myriad of poor African children, without giving any context at all," says Beathe Øgård, president of SAIH.

And the kind of image that Øgård mentions is rampant. A study published in Lancet Global Health in January demonstrated that roughly 1,000 photos from the World Bank and other organizations perpetuated biases by using images of African people out of context or featuring vulnerable-looking Black children.

Ugandan entrepreneur Teddy Ruge says that the idea of the "white savior" is a remnant of colonialism, a time when the Global North put forth the idea of "white expertise over the savages." Ruge, who goes by TMS on his website, has partnered with Global Health Corps and other organizations. To compensate for decades of "white savior" imagery, Ruge says, Africans and people from the Global South "have to contribute largely to changing the databases and overwhelming the databases, so that we are also visible."