Abortion, Birth Control, Emergency Contraception, feminist policy, pregnancy, pro-choice

Siri and Abortion

So there has been some online buzz about how Siri (the personal assistant app on the new iPhone) handles issues related to women’s health. Apparently she can’t help you find any useful information. If you ask her where to get an abortion in NYC, she tells you she can’t help you; if you ask her the same question in Washington, D.C., she directs you to two crisis pregnancy centers (CPCs), one in Virginia and one in Pennsylvania. This is in spite of the fact that multiple abortion clinics exist in both of these areas.

Many websites have noted this omission, and many commenters have pointed out that if you ask Siri directly where there is a Planned Parenthood, she is able to find one for you. This has many people deducing that the problem is a programming error related to a lack of female programmers at the various stages of Siri’s development. The idea is that without female programmers, Siri just didn’t get the information she needed to handle this type of question properly.

While I fully admit I am not a programmer and don’t know much about artificial intelligence, I can’t help but think there is more to the story here than that. I say this because a writer in Pittsburgh decided to see how Siri responded to requests for abortion information in her city. Her general questions got much the same responses as general questions in D.C. and NYC did: vague answers and a lack of real help. So she tried searching for the specific names of abortion clinics in her area, American Women’s Services and Allegheny Reproductive Health Center. She was unable to get results for either, in spite of the fact that Google searches for both names bring up numerous results. If this were a simple programming oversight that caused Siri not to understand how to respond to requests for abortions, abortion clinics, or abortion centers, then why can’t she find specific businesses with online presences?
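To make that distinction concrete (and with the caveat, again, that I’m not a programmer): here is a rough, entirely made-up sketch of how an assistant might route these requests. None of the names, data, or structure below reflect how Siri actually works; it only illustrates the difference between a missing keyword and a listing that is absent from the business data itself.

```python
# Hypothetical toy model: an assistant that answers local-search requests
# two different ways. This is purely illustrative, not Siri's actual code.

# A made-up directory of local listings (names and categories are invented).
BUSINESS_DIRECTORY = [
    {"name": "Allegheny Reproductive Health Center", "category": "abortion clinic"},
    {"name": "Example Family Pharmacy", "category": "pharmacy"},
]

# A made-up "lexicon" mapping words in a request to a search category.
# "abortion" is deliberately left out here to mimic the glitch theory.
KEYWORD_TO_CATEGORY = {
    "pharmacy": "pharmacy",
    "drugstore": "pharmacy",
}

def search_by_intent(request: str) -> list[str]:
    """Route a spoken request to a category search via the keyword lexicon."""
    for word, category in KEYWORD_TO_CATEGORY.items():
        if word in request.lower():
            return [b["name"] for b in BUSINESS_DIRECTORY if b["category"] == category]
    return []  # the lexicon never understood the request

def search_by_name(request: str) -> list[str]:
    """Look a business up directly by its name, skipping the lexicon entirely."""
    return [b["name"] for b in BUSINESS_DIRECTORY
            if b["name"].lower() in request.lower()]

# A lexicon gap alone would break the first search but not the second:
print(search_by_intent("I need an abortion"))                       # [] - keyword missing
print(search_by_name("Find Allegheny Reproductive Health Center"))  # still found
```

If the only problem were a missing keyword, asking for a clinic by its exact name should still work, the way the second search does in this sketch. The Pittsburgh writer’s experience, where even name searches came up empty, is what makes the “simple oversight” explanation harder for me to accept.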

It isn’t just abortion she can’t help you with. She sends you to local emergency rooms if you ask her for emergency contraception and can’t get you anything at all if you ask her for Plan B or the morning-after pill. She also fails to respond properly to requests for help for rape victims. This is possibly the worst oversight, because when asked for rape resources all she says is “I didn’t find any sexual abuse treatment centers.” So she knows what rape is but is apparently incapable of searching for rape resources online.

I’m not the only person who feels these omissions are suspect. Perivision, “An iPhone Centric Blog… Full of Tech Goodness,” says this:

“I do believe its possible that this is just a glitch in the lexicon set, but given how well Siri [can] interpret other requests, AND that google and bing will give you the proper responses when doing a search, AND it offers CPC sites (thus it must understand the word abortion) this has to have been something placed in the code or taught to Siri by someone(s). Also, ask it for planned parenthood, and it will find four places in San Francisco. Say, ‘I need an abortion’ and it will say it cannot find any abortion clinics. As if those institutions were erased from the dataset.”

They understand far more about the tech side of this than I do, so reading that just added to my curiosity. Many others have pointed out that even if it is an intentional omission, it may not be something Apple had anything to do with or knows anything about, as they didn’t develop Siri; they purchased the technology from its creators in 2010. Whether it is intentional or not, what I am most curious about is how Apple will respond now that they know about it. And they must know about it… at least three writers on the topic reached out directly to Apple to see what they had to say, though it seems no one has gotten a response yet.

What do you think? Is this all an omission based on a lack of female programmers, or is there something deeper going on? I’d especially love to hear from anyone who has a deeper understanding of programming.
