AI bias towards ‘whiteness’ risks racial inequality, study finds

People of colour are not being properly represented in depictions of AI, which risks whitewashing humanity’s perception of its technologically enhanced future, researchers have said.

From stock images and cinematic robots to the dialects of virtual assistants, AI is more often than not depicted as a Caucasian entity, the study from Cambridge University's Leverhulme Centre for the Future of Intelligence (CFI) found.

Current portrayals of and stereotypes about AI risk creating a “racially homogenous” workforce, with racial stereotyping baked into its algorithms.

The researchers believe that cultural depictions of AI as white need to be challenged, as they do not offer a “post-racial” future but rather one from which people of colour are simply erased.

They argue that there is a long tradition of crude racial stereotypes when it comes to extra-terrestrials – from the “orientalised” alien of Ming the Merciless to the Caribbean caricature of Jar Jar Binks.

But artificial intelligence, unlike species from other planets, is portrayed as white because it is credited with the very attributes that have been used to “justify colonialism and segregation” in the past.

“Given that society has, for centuries, promoted the association of intelligence with White Europeans, it is to be expected that when this culture is asked to imagine an intelligent machine it imagines a White machine,” said researcher Dr Kanta Dihal.

“People trust AI to make decisions. Cultural depictions foster the idea that AI is less fallible than humans. In cases where these systems are racialised as white that could have dangerous consequences for humans that are not.”

Real-world examples of this have already occurred, such as a report from last year that found facial recognition was misidentifying people of colour more often than white people.
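The disparity such audits report is usually expressed as a per-group error rate: the fraction of faces matched to the wrong identity, computed separately for each demographic group. The sketch below illustrates that metric only; the group names and numbers are invented for demonstration and are not data from the study or the report mentioned above.

```python
# Illustrative sketch of a per-group misidentification rate, the kind of
# metric facial-recognition audits compare across demographic groups.
# All identities and results below are made up for demonstration.

def misidentification_rate(predictions, labels):
    """Fraction of faces matched to the wrong identity."""
    errors = sum(1 for p, t in zip(predictions, labels) if p != t)
    return errors / len(labels)

# Hypothetical audit results: (predicted identity, true identity) per group
results = {
    "group_a": (["id1", "id2", "id3", "id4"], ["id1", "id2", "id3", "id4"]),
    "group_b": (["id1", "id9", "id3", "id9"], ["id1", "id2", "id3", "id4"]),
}

rates = {g: misidentification_rate(p, t) for g, (p, t) in results.items()}
print(rates)  # a gap between groups is the disparity audits flag
```

A real audit would of course use matched datasets and far larger samples; the point here is simply that “misidentifying people of colour more often” is a measurable, comparable quantity rather than an impression.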

“One of the most common interactions with AI technology is through virtual assistants in devices such as smartphones, which talk in standard white middle-class English,” said Dihal.

“Ideas of adding Black dialects have been dismissed as too controversial or outside the target market.”

The researchers conducted their own investigation into search engines, and found that all non-abstract results for AI had either Caucasian features or were literally the colour white.

“Stock imagery for AI distils the visualisations of intelligent machines in western popular culture as it has developed over decades,” said Dr Stephen Cave, executive director of CFI.

“From Terminator to Blade Runner, Metropolis to Ex Machina, all are played by White actors or are visibly White onscreen. Androids of metal or plastic are given white features, such as in I, Robot. Even disembodied AI – from HAL-9000 to Samantha in Her – have White voices. Only very recently have a few TV shows, such as Westworld, used AI characters with a mix of skin tones.”

Cave and Dihal point out that even works clearly based on slave rebellion, such as Blade Runner, depict their AIs as White. “AI is often depicted as outsmarting and surpassing humanity,” said Dihal. “White culture can’t imagine being taken over by superior beings resembling races it has historically framed as inferior.”

“Images of AI are not generic representations of human-like machines: their Whiteness is a proxy for their status and potential,” added Dihal.

“Portrayals of AI as White situates machines in a power hierarchy above currently marginalised groups, and relegates people of colour to positions below that of machines. As machines become increasingly central to automated decision-making in areas such as employment and criminal justice, this could be highly consequential.”

“The perceived Whiteness of AI will make it more difficult for people of colour to advance in the field. If the developer demographic does not diversify, AI stands to exacerbate racial inequality.”

Sign up to the E&T News e-mail to get great stories like this delivered to your inbox every day.
