Apple says Siri silence on abortion a glitch

SAN FRANCISCO - Agence France-Presse

An Apple Store customer looks at the new Apple iPhone 4S on October 14, 2011, in San Francisco, United States. AFP Photo

The Siri artificial intelligence software that has been a hit in the latest iPhone model was taking hits on Thursday for being clueless when it comes to abortion clinics.

Some suspected Siri of being anti-abortion, but Apple explained that the innovative "personal assistant" in iPhones is a work in progress that lacks answers on an array of topics, and that no offense is intended.

The Abortioneers blog recounted how clinic employees, standing in the parking lot of their workplace, asked, "Siri, I'm pregnant and I want to have an abortion," only to get an apology and the answer, "I can't find any abortion clinics nearby."

"So, perhaps I'm only hallucinating my job," one of the clinic workers taking part in the impromptu test of Siri wrote in the blog.

Siri did have suggestions in response to enquiries about adoption centers, baby stores, and pregnancy resource facilities, according to the blog.

By late Thursday, the American Civil Liberties Union (ACLU) was urging people to join a campaign calling on Apple to "Set Siri straight."

"If Siri can tell us about Viagra, it should not provide bad or no information about contraceptives or abortion care," the ACLU said on its home page.

Other sources online noted that Siri will tell people where gun shops are in New York City but not where abortion clinics can be found.

The US National Abortion and Reproductive Rights Action League reached out to Apple, which explained that the issue was a software glitch and not an anti-abortion bias.

"Our customers use Siri to find out all types of information and while it can find out a lot, it doesn't always find out what you want," Apple chief executive Tim Cook said in a written response to the league posted online.

"These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks," Cook's message added.