If you've got an iPhone, chances are you're at least familiar with Apple's digital assistant, Siri.
But Siri doesn't really have a mind of its own, so when it draws on data to answer your questions, it's only as good as the information it's given.
It's safe to say that whatever happened here, Apple really dropped the ball on this one.
Take a look at the huge fail.
The question and its outrageous response have gone viral as Twitter users share the shocking news.
I don’t know why anyone would phrase a question this way, but the response far overshadows the reason for asking it, as you’ll see.
There’s an explanation for this maddeningly insensitive and racist response. See if you find it satisfactory.
It's since been updated because, well, you obviously can't let something like that stick around for a second longer.
The fact that it appeared at all is troubling enough.
People realize that Siri just pulls info from the Internet, but many want to know why there's no review process.
Granted, there’s a lot of info out there, but Apple’s a big company and can’t afford to let incidents like this slide.
The reason for all this? Well, it makes a bit more sense when it's explained, but it's still pretty unforgivable.
Siri is pulling from Wikipedia, which, as we all know, gets its info from user edits. It turns out that Siri was pulling from an early and since-deleted edit that had the offensive language in it.
That explains it, but it doesn't really excuse it, now does it?
Siri isn’t a person, and asking the service a question is a lot like Googling something and getting an offensive result.
But if Apple is looking to build a more personal connection with its users, many of whom were offended by this incident, it might want to filter out results before they go viral the way this insensitive response did.