
Why the problem with Siri matters

Controversy continues for Apple's virtual personal assistant Siri and its behavior around female sexual healthcare crises.
Written by Violet Blue, Contributor

A week ago, bloggers brought to light that Apple's voice-controlled mobile assistant Siri was producing distressing omissions when asked about certain kinds of emergency healthcare situations.

Specifically, situations that relate to female sexual emergencies. American bloggers in different locations asked Siri to help them find abortion clinics, the morning-after pill and rape crisis resources. When Siri delivered results at all, they were often the offensive opposite of what was asked for.

If you missed the controversy, Siri was branded across news headlines as being "pro-life" (the questionable self-name for anti-abortion crusaders, widely known for their violence, aggression and fanaticism).

The story is no longer breaking but it's certainly still unfolding. The ACLU has a petition and is not letting the issue fade.

The reason Siri got its pro-life association was that the program didn't just return zero results for abortion clinics when it should have returned plenty. In multiple instances Siri directed users to pro-life "pregnancy counseling centers" whose primary mission is to talk women out of certain sex-related healthcare options, including abortion and contraception.

Siri also couldn't find a clinic that performs abortions even when asked for a specific women's healthcare clinic by name.

Many pointed out that when asked for Viagra, Siri had no problem pointing users to erection pills, and the same went for finding sex workers - and humorous places to hide their dead bodies when you were done. Yet Siri had no idea what morning-after pills were, and queries about rape resources returned responses that made Siri seem like it was making fun of rape victims.

The main issue was that Siri appeared to be skewed pro-life.

Everyone, Get Your Tinfoil Hats

It would be much less murky if Apple's founder wasn't a newly minted pro-life hero. Catholic websites and pro-life blogs celebrated the news about Siri. USA Catholic said that "the pro-life iPhone" was "another case for the sainthood of Steve Jobs."

The first case for his sainthood cited by anti-choice blogs and websites was the sentiment Jobs expressed in his biography that he was glad his mother didn't choose abortion. (Although this is a curious thing to say: abortion was illegal in most of the United States until Roe v. Wade in 1973, and Jobs was born in 1955.)

The same pro-life and "family" groups had also applauded Jobs' strong anti-pornography stance.

Accusing Apple of anti-choice malfeasance is a stretch, to put it lightly. And despite the wet dreams of many pro-life blogs and religious websites, the political position of Steve Jobs on abortion is unknown.

After the issue made the mainstream press, Apple had the really good idea of not leaving the response to Siri.

Instead Apple told the New York Times, "These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks."

That's a great answer. Especially because it doesn't step in the mess of spelling out exactly which omissions they're talking about, and the "it's beta" excuse strips users of their standing to complain, because they're willingly using a product that was knowingly released unfinished.

Programmer, please. "It's beta" is for startups on the playground. "It's beta" rings hollow on a product that has high-end TV commercials, especially an Apple product, and more so when Siri itself is often a key reason people buy the new iPhone.

Siri has an interesting backstory. It began as a startup app that emerged from stealth in 2008, a smart "virtual personal assistant" for iPhone with a planned Android release. Well, at least that was the plan until it was acquired by Apple in April 2010. No one's saying what Apple has done to Siri since the purchase.

Siri was midwifed by the CALO project, and her speech recognition comes courtesy of Nuance. An important side detail here is that one of Nuance's main business verticals is healthcare.

Still, There's Something [Icky] About Siri

It's pretty hard to get around at least one theme in the whole Siri female sex crisis dustup.

It's just icky that (in the US) Siri is a woman who does your bidding, and you're definitely in luck if you're a guy who wants "her" to fetch you penis pills to get it up and a sex worker for getting off. But if you get raped, it's kind of like: sorry, sister.

I still think we should never suspect bad intentions when the explanation is simple ignorance. And hey - Siri isn't actually that great for critical tasks; that's no big secret.

But let's take another look at why Siri isn't helping out her American sisters when they're having a sexual healthcare crisis.

If you can get through the defensive histrionics over at The Unofficial Apple Weblog, we're told that Siri's got no social or political agenda, and that her flawed results are instead to be blamed on the fact that she retrieves results from Yelp and other databases built on user-generated tagging systems.

Unfortunately, the well-meaning TUAW article has too many "if Siri works this way" explanations to be conclusive either way - we're still left with a "maybe this is what's going on here."

With much lower blood pressure, Search Engine Land did a careful analysis of the way Siri might be pulling results from places that simply don't have exact words (like "abortion") in their names or titles. Danny Sullivan suggests they may not be listed in the "abortion" categories either, but he adds, "That's the best guess I have."
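To make that guess concrete, here's a minimal sketch of how keyword-only matching could produce exactly this skew. Everything in it is an assumption: the listings, names, category tags and matching rules are invented for illustration and don't come from Apple, Yelp or Sullivan's post.

```python
# A hypothetical, simplified model of keyword-only local-search matching.
# The listings and matching rules are invented for illustration; they do
# not reflect Apple's or Yelp's actual data, categories or ranking.

listings = [
    # A clinic that actually provides the service, but whose name and
    # category tags never contain the word "abortion".
    {"name": "Capital Women's Health Clinic", "categories": ["Medical Centers"]},
    # A crisis pregnancy center that deliberately tags itself with the term.
    {"name": "First Choice Pregnancy Clinic",
     "categories": ["Counseling", "Abortion Alternatives"]},
]


def term_matches(term: str, listing: dict) -> bool:
    """True if the query term appears in the listing's name or category tags."""
    fields = [listing["name"]] + listing["categories"]
    return any(term.lower() in field.lower() for field in fields)


def search(query: str) -> list:
    """Return listing names in which every query term matches name or categories."""
    terms = query.lower().split()
    return [l["name"] for l in listings if all(term_matches(t, l) for t in terms)]


print(search("abortion clinic"))
# ['First Choice Pregnancy Clinic'] -- the provider that never labels itself
# with the keyword doesn't surface, while the center that does label itself wins.
```

Whether anything like this is what Siri actually does is exactly the open question Sullivan raises; the point of the sketch is only that keyword-and-category matching can quietly skew results without anyone encoding a political agenda.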

Sullivan still had to point out that Siri behaves differently around these touchy topics than it does with others. He did a really great job testing terms, unpacking as much of Siri's search behavior as he could and comparing it against Yelp's own results.

He came up with some interesting discrepancies.

For instance, there are the Washington DC results where Siri produces two pro-life centers, yet Sullivan had to really hunt to find the same pro-life listings in Yelp search results. He writes, "Woah. What’s going on there? I don’t know."

In the same vein, while Apple's PR department might have fed those clever and humorous easter egg questions to tech bloggers, its statements in response to the female sexuality flap are weak and vague. I strongly agree with Sullivan that with this PR strategy they're not really answering anything - and it doesn't help.

I really like that Danny reminds us Siri doesn't actually "know" what we're talking about, and that Apple is new to search - and search engine PR disasters.

But in his post there is something he only hinted at, and I want to tease it apart to show you why I think the problem with Siri really matters.

Apple's engineers are clever enough to spend thought, effort and time to make Siri give us funny answers about hiding dead bodies.

But the same workers didn't spare a single minute to make sure that a variety of female sexual healthcare crises and emergencies received a minimum of equal consideration.

That is the thing, in all this mess, that I find most astonishing.

This is a real problem.
