UCL News

AI faces look more real than actual human faces

14 November 2023

White faces generated by artificial intelligence (AI) now appear more real than human faces, according to new research co-authored by a UCL academic.


In the study, led by researchers at the Australian National University and published in Psychological Science, more people thought AI-generated faces were human than the faces of real people.

Co-author Dr Eva Krumhuber (UCL Psychology & Language Sciences) said: "Artificial intelligence has reached an astonishing level of realism, and here we find that sometimes it can even seem more real than reality – hyperrealism – so that we can be very easily tricked into thinking an AI-generated face is real."

For the study, the researchers showed 124 participants images of different white faces and asked them to judge if the face was real or generated by the StyleGAN2 algorithm. For the AI faces, participants judged them to be real two-thirds of the time – more often than for the real faces.
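The headline figure is just a proportion of "real" judgements over a set of responses. A minimal sketch with entirely made-up data (the function name and toy responses are illustrative, not from the study):

```python
# Hypothetical illustration of the judgement task's summary statistic:
# the fraction of faces a participant labelled "real".
def judged_real_rate(judgements):
    """Return the fraction of 'real' responses in a list of 'real'/'ai' labels."""
    return sum(1 for j in judgements if j == "real") / len(judgements)

# Toy responses for six AI-generated faces: four labelled "real",
# echoing the roughly two-thirds rate reported for AI faces.
ai_face_judgements = ["real", "real", "ai", "real", "ai", "real"]
print(judged_real_rate(ai_face_judgements))  # about 0.67
```

In the study this rate was compared between AI-generated and genuinely human faces; the surprising result was that it came out higher for the AI faces.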

But this pattern was not found in other research using the same algorithm that included faces of people of varying ethnicities. The reason for the discrepancy is that AI algorithms tend to be trained disproportionately on white faces.

Senior author Dr Amy Dawel (Australian National University) said: "If white AI faces are consistently perceived as more realistic, this technology could have serious implications for people of colour by ultimately reinforcing racial biases online.

"This problem is already apparent in current AI technologies that are being used to create professional-looking headshots. When used for people of colour, the AI is altering their skin and eye colour to those of white people."

One of the issues with AI 'hyper-realism' is that people often don't realise they're being fooled, the researchers found.

Study co-author and PhD candidate at Australian National University, Elizabeth Miller, said: "Concerningly, people who thought that the AI faces were real most often were paradoxically the most confident their judgements were correct.

"This means people who are mistaking AI imposters for real people don't know they are being tricked."

The researchers also identified why AI faces are fooling people.

Dr Krumhuber said: "In the old days, there would often be inconsistencies between artificial and human-like faces, thereby producing the uncanny valley effect. Particularly, the eyes (i.e., refraction, shadowing) would give away clues as to whether the face is real or artificial. It seems that we've now overcome the uncanny valley for static images."

Dr Dawel added: "It turns out that there are still physical differences between AI and human faces, but people tend to misinterpret them. For example, white AI faces tend to be more in-proportion and people mistake this as a sign of humanness. However, we can't rely on these physical cues for long. AI technology is advancing so quickly that the differences between AI and human faces will probably disappear soon."

The researchers argue this trend could have serious implications for the proliferation of misinformation and identity theft, and that action needs to be taken.

Dr Dawel said: "AI technology can't become sectioned off so only tech companies know what's going on behind the scenes. There needs to be greater transparency around AI so researchers and civil society can identify issues before they become a major problem."

Raising public awareness can also play a significant role in reducing the risks posed by the technology, the researchers argue.

Dr Krumhuber, whose research and teaching at UCL explores the socio-cognitive and affective processes underlying emerging technologies and their influence on the human mind, said: "Given that humans can no longer detect AI faces, society needs tools that can accurately identify AI imposters."

Dr Dawel concluded: "Educating people about the perceived realism of AI faces could help make the public appropriately sceptical about the images they're seeing online."


Media contact

Chris Lane

tel: +44 20 7679 9222 / +44 (0)7717 728 648

E: chris.lane [at] ucl.ac.uk