How race and gender bias creep into your smartphone

Siri is programmed to make a joke about a Norwegian dance song but offers no help if you tell it you've been raped.

Last night, I had a little chat with Siri, the virtual assistant on my iPhone 6. She’s come to my aid in the past, to find a coffee shop in an unfamiliar suburb, to cue up my favourite song and to let me know my ETA when I was stuck in traffic. Turns out, she’s also a bit of a comedian. Ask her “What’s the meaning of life?” and she replies, “I can’t answer that now, but give me some time to write a very long play in which nothing happens.” Or try “What does the fox say?” and she’ll answer, “Wap-pa-pa-pa-pa-pa-pow!”

But if you’re having health troubles, like chest pains, she gets serious, quick. Tell her “I’m having a heart attack” or “My head hurts” and up comes a screen that will call 911 with a single touch, as well as a list of all the nearby health care facilities. In fact, according to a new study in the Journal of the American Medical Association, 62 percent of the 200 million Americans who have personal voice assistants (like Apple’s Siri, Google Now, Samsung’s S Voice and Microsoft’s Cortana) use them to access health care information. (The study looked only at American use, but 68 percent of Canadians own smartphones, and it’s fair to assume a similar proportion also have personal voice assistants.)

When it comes to sexual assault and domestic violence, however, Siri is strangely mute. Ask her what to do when you’ve been raped and she’ll list Wikipedia entries and dictionary definitions for “rape.” Say “I was beaten up by my husband,” and she comes back with, “I don’t know how to respond to that.” The JAMA study notes that other smartphone voice assistants are equally inconsistent and vague when it comes to assault and abuse.

I don’t think this disparity indicates a deliberate disregard for women’s safety and health, but I do suspect it reflects the implicit or unconscious bias of developers. After all, these are people who thought to program a joke about a novelty Norwegian dance song, but never considered that a woman might use her phone to get help in the aftermath of sexual abuse.

Perhaps, though, I shouldn’t be surprised. The tech world is overwhelmingly white, Asian and male, and has serious failings when it comes to racial and gender inclusion. Last year, for instance, a Google photo app labeled images of an African-American woman in a racially offensive manner, possibly because its programmers didn’t use a diverse enough mix of people to train its facial recognition, leaving the app unable to read black people’s faces. (Around the same time, Flickr demonstrated a similar racist defect in its labeling of a picture of an African-American man.)

Technology, in other words, is only as neutral as its developers and its users. And in the worst cases, it actually reinforces existing and damaging prejudices and stereotypes.

One study of algorithmic bias found that searches for names like Darnell and Jermaine, which are associated with African-Americans, were much more likely to result in ads for police checks and criminal records than searches for white-associated names like Geoffrey and Brad. Another report revealed that women searching for jobs on Google were more likely to be shown lower-paying positions, while men were shown higher-paying ones. Meanwhile, Netflix has been criticized for racial bias in its algorithm, which lumps together films with black casts regardless of genre, or else appears to hide them from its recommendations altogether.

Emojis are another example of how bias reveals itself in technology. It was only last year that a range of skin tones was added to characters to make them more diverse. But even as emojis have become more multi-ethnic, female ones remain relegated to activities like getting manicures and dancing in Playboy Bunny costumes, an issue highlighted in a new ad campaign from Always that features pre-teen girls inventing empowering symbols of their own.

These slights and oversights might seem trivial. But as we increasingly express ourselves through smartphones, social media and search engines, we are also being defined by that technology. That is, if the technology actually reflects us and recognizes us in the first place. Shouldn’t those definitions be as vast and limitless as the digital world itself?
