It sounds like a scenario straight out of a Ridley Scott film: technology that not only sounds more “real” than actual humans, but looks more convincing, too. Yet it seems that moment has already arrived.
A new study has found people are more likely to think pictures of white faces generated by AI are human than photographs of real individuals.
“Remarkably, white AI faces can convincingly pass as more real than human faces — and people do not realize they are being fooled,” the researchers report.
The team, which includes researchers from Australia, the UK and the Netherlands, said their findings had important real-world implications, including for identity theft, with the possibility that people could end up being duped by digital impostors.
However, the team said the results did not hold for images of people of color, possibly because the algorithm used to generate AI faces was largely trained on images of white people.
Zak Witkower, a co-author of the research from the University of Amsterdam, said that could have ramifications for areas ranging from online therapy to robots.
“It’s going to produce more realistic situations for white faces than other race faces,” he said.
The team cautioned that such a situation could mean perceptions of race end up being confounded with perceptions of being "human," and added that it could perpetuate social biases, including in searches for missing children, which can rely on AI-generated faces.
Writing in the journal Psychological Science, the team describe how they carried out two experiments. In one, white adults were each shown half of a selection of 100 AI white faces and 100 human white faces. The team chose this approach to avoid potential biases in how own-race faces are recognized compared with other-race faces.
The participants were asked to select whether each face was AI-generated or real, and how confident they were on a 100-point scale.
The results from 124 participants revealed that 66 percent of AI images were rated as human, compared with 51 percent of real images.
The team said a re-analysis of data from a previous study found that people were more likely to rate white AI faces as human than real white faces. However, this was not the case for images of people of color, where about 51 percent of both AI and real faces were judged as human. The team added that the results did not appear to be affected by the participants' own race.
In a second experiment, participants were asked to rate AI and human faces on 14 attributes, such as age and symmetry, without being told some images were AI-generated.
The team’s analysis of results from 610 participants suggested the main factors that led people to erroneously believe AI faces were human included greater proportionality in the face, greater familiarity and less memorability.
Somewhat ironically, while humans seem unable to tell apart real faces from those generated by AI, the team developed a machine learning system that can do so with 94 percent accuracy.
Clare Sutherland, co-author of the study from the University of Aberdeen, said the study highlighted the importance of tackling biases in AI.
“As the world changes extremely rapidly with the introduction of AI, it’s critical that we make sure that no one is left behind or disadvantaged in any situation — whether due to ethnicity, gender, age, or any other protected characteristic,” she said.