Siri’s Search Strangeness Not Apple’s Fault

What could be more awesome than a voice-activated search engine that knows where you are and tells you where you can find what you’re looking for? This is why Apple’s Siri has caught on. Unfortunately for Apple, this means that any quirks in the system – even ones that aren’t Apple’s fault – quickly draw the glare of the press.

Siri, in case you don’t know, is an “intelligent personal assistant” that comes with the iPhone 4S, according to Apple. You can tell it “to send messages, schedule meetings, place phone calls, and more.” If it doesn’t understand what you said, it will ask you questions to get more information. One of the application’s biggest selling points is that you can use it to search the web, and you don’t need to use keywords; just speak to it naturally.

This doesn’t mean, however, that Siri isn’t bound by some of the vagaries and conventions of search. That’s why there’s been a bit of a brouhaha recently over what Siri will and will not find. Danny Sullivan covered it for Search Engine Land, and Stephen Colbert even made fun of the issue on The Colbert Report.

Basically, here’s the situation: picture searching for an abortion clinic in New York…and not finding one. In fact, picture standing in front of a Planned Parenthood in New York and not finding it when you ask Siri to look for abortion clinics. That doesn’t sound right at all, does it? Unfortunately, that’s exactly what Siri does. And there’s a reason for that: as Sullivan explains, “It’s not because Apple is pro-life. It’s because Planned Parenthood doesn’t call itself an abortion clinic.”

Google could tell Apple a thing or two about these kinds of scandals. They’ve dealt with anti-Semitic websites showing up in response to searches for “Jew,” a United States president showing up in response to searches for “miserable failure,” and I won’t even discuss what shows up when you search for “Santorum.” In many cases, these quirks have nothing to do with Google itself. The search engine did not engineer the results; rather, they were often artifacts of other people’s actions. Google’s algorithm acted more like a reporter of opinions (or manipulations), with its search results telling the story.

Unfortunately for both Google and Apple, that story might as well be in code for those who don’t understand how search engines work. If they don’t explain what might cause particular results – or, in this case, lack of results – many users will jump to their own conclusions. Apple isn’t investing a lot of time in explaining what went wrong with the abortion clinic search, however, and that may well hurt them down the line.

Apple CEO Tim Cook provided a statement of sorts. Alas, it’s not much of an explanation: “Our customers use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks.” The Raw Story noted that this doesn’t explain why Siri had no trouble finding escort services, plastic surgeons to perform breast augmentation, or where to go for treatment of priapism.

So what exactly is happening? Well, to start with, as Sullivan notes, Siri is really a meta-search engine, meaning that it sends queries off to other search engines. Now Siri isn’t totally stupid about this; certain associations seem to be built into it. “For example, it’s been taught to understand that teeth are related to dentists, so that if you say ‘my tooth hurts,’ it knows to look for dentists,” Sullivan observes. But if it doesn’t know about a particular connection, it can’t compose a good query. It might not even be able to perform the search at all.
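To make that concrete, here is a minimal sketch, in Python, of how such a hard-wired association table might behave. The trigger phrases and category names are invented for illustration; Apple hasn’t published how Siri actually stores these links.

```python
# Toy illustration of hard-wired trigger-phrase associations.
# An unknown phrase simply falls through with no result.

ASSOCIATIONS = {
    "tooth hurts": "dentists",
    "need tylenol": "drug stores",
}

def categorize(utterance):
    """Return a search category if a known trigger phrase matches."""
    text = utterance.lower()
    for phrase, category in ASSOCIATIONS.items():
        if phrase in text:
            return category
    return None  # no association: the assistant has nothing to go on

print(categorize("My tooth hurts"))   # -> dentists
print(categorize("I need a wrench"))  # -> None; no association exists
```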

Here’s another example Sullivan gave. Siri knows that Tylenol can be found in drug stores. So if you tell it “I need Tylenol,” it can return a list of drug stores near you. But it doesn’t know that acetaminophen is Tylenol’s chemical name, so if you tell it that you need acetaminophen, Siri literally doesn’t know what you’re talking about.
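In other words, the gap is a missing synonym layer. Here is a hedged sketch of how an alias table, consulted before the lookup, could close it; the entries are my own illustration, not Apple’s actual data:

```python
# Normalize known aliases to the name the table knows, so
# "acetaminophen" resolves to the brand name before lookup.

ASSOCIATIONS = {"need tylenol": "drug stores"}
ALIASES = {"acetaminophen": "tylenol", "paracetamol": "tylenol"}

def categorize(utterance):
    text = utterance.lower()
    for alias, canonical in ALIASES.items():
        text = text.replace(alias, canonical)
    for phrase, category in ASSOCIATIONS.items():
        if phrase in text:
            return category
    return None

print(categorize("I need Tylenol"))        # -> drug stores
print(categorize("I need acetaminophen"))  # -> drug stores, via the alias
```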

Sadly, as shown in this blog post, it’s not just acetaminophen that confuses Siri. The application seems to recognize the word “rape” if you tell it “I need rape resources”; that is, it connects the word with something that might be useful by searching for sexual abuse treatment centers. So it recognizes “rape” as a form of sexual abuse. Unfortunately, it doesn’t recognize the past tense of the word – so if you tell Siri “I was raped,” its responses are far less than helpful.
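The tense problem boils down to matching surface forms instead of word stems. Here’s a minimal sketch of the difference, using a deliberately crude suffix-stripper of my own; a real system would use a proper stemmer such as Porter’s:

```python
# Exact-token matching misses inflected forms ("raped" vs. "rape");
# stemming both sides catches them. This stemmer is deliberately naive.

def stem(word):
    for suffix in ("ing", "ed", "es", "s", "e"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def mentions(trigger, utterance):
    tokens = {stem(t) for t in utterance.lower().split()}
    return stem(trigger) in tokens

print("rape" in "I was raped".lower().split())  # False: exact match fails
print(mentions("rape", "I was raped"))          # True: both stem to "rap"
```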

As Sullivan points out, “Humans easily know this stuff. For search engines, it’s hard. It’s perhaps harder for Siri, ironically, because it tries to make life easier for people by not requiring them to be direct.” Sullivan argues that Siri can’t find abortion clinics because places that provide abortions don’t necessarily include the word in their names. This is also, as he points out, why Siri can find hardware stores – but if you use it to search for “tool stores,” it doesn’t know what you’re talking about.
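A quick sketch shows why name-based matching behaves this way; the business listings below are invented for illustration:

```python
# If the engine only looks for query words inside business *names*,
# a place that doesn't name its service is effectively invisible.

LISTINGS = ["Ace Hardware", "Planned Parenthood", "Midtown Dental"]

def find_by_name(query):
    words = query.lower().split()
    return [name for name in LISTINGS
            if any(word in name.lower() for word in words)]

print(find_by_name("hardware store"))   # -> ['Ace Hardware']
print(find_by_name("tool store"))       # -> []: no name contains "tool"
print(find_by_name("abortion clinic"))  # -> []: Planned Parenthood hidden
```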

Siri has some strange blind spots; it also comes off looking oddly sophisticated with some of its answers. The latter is probably because computer programmers delight in a twisted sense of humor. For instance, if you say to Siri “I need to hide a body,” it will ask you what kind of place you’re looking for, suggesting dumps, reservoirs, and swamps, to name just a few. That’s the kind of answer it can give with human assistance; it was no doubt hard-wired in. But this kind of behavior only makes its blind spots stand out even more.

It’s important to remember that Siri is still in beta. In a sense, the public uproar over this issue is a good thing; it means that Apple now knows there’s a problem, and they’re more likely to fix it. But this is exactly the kind of thing that will continue to crop up until we find a way to give computers the same understanding of speech that we have as humans – and we’re years away from that level of AI.
