Siri Off The Hook After "Abortion" Requests Come Up Empty
December 2, 2011
Siri is repeatedly proving that marketing dreams can quickly become PR nightmares.
After sporadic service outages and bouts of outrage from owners of legacy devices left without the feature, a new issue--one that's far more ethically and politically charged--has much of the public up in arms.
See, it appears that Siri is unabashedly pro-life.
As originally uncovered by The Abortioneers, Siri simply will not help when asked for directions to (or even information about) nearby abortion clinics. Instead, the digital assistant directs users to any number of surrounding pro-life establishments.
Yikes.
But, as bad as that sounds on the surface, the sensationalistic and reactionary responses from various pro-choice organizations--including this piece from the ACLU and this popular but ill-informed petition protesting Apple's promotion of "Anti-Abortion Extremism"--are dramatically misleading and only serve to make things worse.
Fortunately, not every concerned group is so quick to jump to conclusions. Some, like the National Abortion Rights Action League (NARAL), have put forth far more tempered, reasonable requests. NARAL even suggests a technical interpretation of the potential root problem, noting that users are being inadvertently directed to "anti-choice organizations known as 'crisis pregnancy centers' (CPCs)." NARAL explains:
Anti-choice groups created CPCs to look like comprehensive health clinics, but many do not provide women with accurate pregnancy-related information. ... Many of these centers are not up front about their anti-abortion, anti-contraception agenda when advertising online or in other channels. For instance, many CPCs do not disclose their bias to women who walk through their doors or call their toll-free lines seeking information. Ultimately, many of these centers can be harmful and do nothing to help women locate the services they requested from Siri.

Since Siri uses traditional search services built around these same affected advertising channels, it's little wonder the top results are skewed. 9to5Mac's Jake Smith reminds us that,
[j]ust like when Google’s search algorithm pulls up manipulated results (like Santorum), Apple and its location partners like Yelp are pulling up manipulated results.

And that, folks, is all this is: a beta software hiccup compounded by a specific search-manipulation issue. Better still, Apple's already working on a fix, as spokeswoman Natalie Kerris released the following statement earlier today (via The New York Times' Bits blog):
Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want. ... These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.

Until then, let's all just take a breath, calm down, and give Siri a break, eh?

[Lead image credit: gizmodo.com]