My son recently got his first iPhone, and watching him interact with Siri has been nothing short of mind-blowing. The conventions we accept as technologists were CRUSHED within the first day of ownership and, in parallel, my faith in humanity* was slightly restored. (*Editor’s note: I don’t believe that Derek’s entire faith in humanity rests on his eldest son’s shoulders, so please do not leave us angry comments about unrealistic expectations for his children. We believe that children ARE our future, however, and in this scenario, interesting test cases for AI usability.)
I work in technology. Over the last five years, we have steadily increased our use of, and integration with, artificial intelligence. It is a fascinating area that, when linked to “big data,” can become downright exciting*. That’s a word I have rarely used about technology as of late. (*Editor’s note: when Alexa gets off her lazy algorithm and understands “never play any Red Hot Chili Peppers songs on Pandora ever” when I scream it at her, I’ll be excited.)
With AI, a significant amount of work goes into building a response model. One must consider keywords, domains, contexts, and all sorts of interesting pieces and parts of language that an AI bot needs to “listen” for, and that it can use to query information stores and respond intelligently. While collaborating with a friend* and co-worker* (and a brilliant copy editor*, by the way), we got to discussing the social norms and conventions that people disregard when speaking to AI. (*Editor’s note: hey, that’s me!)
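For the technically curious, here is a minimal sketch of the kind of matching we mean. To be clear, this is not Siri’s actual pipeline or anyone’s real API; `Intent`, `INTENTS`, and `match_intent` are hypothetical names I’m using purely for illustration.

```python
# A toy "response model" -- NOT Siri's real pipeline; Intent, INTENTS,
# and match_intent are hypothetical names used only for illustration.
from dataclasses import dataclass, field


@dataclass
class Intent:
    name: str                 # the domain this intent belongs to
    keywords: set             # words the bot "listens" for
    required_context: set = field(default_factory=set)  # slots that must be filled


INTENTS = [
    Intent("weather", {"weather", "rain", "forecast", "temperature"}),
    Intent("player_stats", {"stats", "record", "goals"},
           {"sport", "player", "team"}),
]


def match_intent(utterance, context):
    """Match keywords to an intent, then verify the context slots are filled."""
    words = set(utterance.lower().split())
    for intent in INTENTS:
        if words & intent.keywords:
            missing = intent.required_context - set(context)
            if missing:
                return "I need more context: " + ", ".join(sorted(missing))
            return f"Querying the {intent.name} store with {context}"
    return "Sorry, I didn't catch that."
```

Ask this toy about the weather and it happily answers; ask for “stats” without saying which sport, player, or team, and it has to beg for more context. Which, as you’ll see, is exactly the wall an 11-year-old runs into.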
There is rarely any civility between a person and their AI. (See example of screaming above.)
We realized that even though our team had written some very human-sounding interactions, most of the target demographic for the system would never hear them since it would take a civil conversation to trigger the responses. We chalked this up to our currently cynical world view and moved on to implementing features (a terrible idea, and we both should have known better).
Fast-forward to a walk with my 11-year-old. I had both of my children out trying to get some air, and my eldest was experimenting with his new iPhone. He was taking pictures and interacting with his new buddy, Siri. I heard him ask about the weather, and when Siri responded, he replied, “Thanks, Siri!”
Siri responded, “I live to serve.”
Maybe that is part of our current problem: communication should be more about fulfilling needs. I digress…
Nick, at this point, was getting more comfortable talking to his pseudo-female companion (he is an 11-year-old boy, after all). He started asking a number of questions, effectively trying to test the limits of his artificial playmate. For basic queries, Siri performed admirably. As Nick became accustomed to this string of successes, he decided that Siri was just another human somehow mounted inside his new iPhone. He rattled off what, to him, was an easy question:
“Siri, tell me Gareth Bale’s stats.” (Nick has a newfound love of soccer. As with many things he falls hard and fast for, he is now trying to consume as much data about it as rapidly as possible. He is also very matter-of-fact in assuming that everyone, his microchip-wearing AI buddy included, knows as much as he does about it AND THEY CARE.)
#SiriFail
Nick rephrased the question a few times, but his revisions changed the words he spoke rather than adding the much-needed context to the question. He even spelled Gareth’s name for Siri.
I heard him getting frustrated as 11-year-olds* are prone to do. (*Editor’s note: I actually think getting frustrated when technology doesn’t work is a Fournier family genetic trait, as I know Derek well and have also witnessed Drew, age four, place an iPad in the microwave as punishment for buffering during Mickey Mouse Clubhouse.) I took the time to explain a few things about Siri to Nick.
- Siri only knows what you tell her about your question.
- Siri cannot read your mind (stop it, ladies… I know the parallels and I do not care. And yes, I guess that is a bit of a gender stereotype, but my editor (a lady*) will fix it or make me look like an idiot* and then publish anyway.) (*Editor’s note: Whoa, while biologically female for sure, I have NEVER claimed to be “a lady.” And I don’t make you “look like an idiot”; I merely “preserve the spirit and integrity of your voice.”)
- Siri genuinely doesn’t CARE enough to try to ask follow-up questions.
I asked Nick who Gareth Bale was, what he did and where he played. He knew those basic pieces of information. I said, “Try asking Siri to give you the soccer stats for Gareth Bale on Real Madrid.”
SUCCESS. Siri rattled off stats and player history for Gareth Bale, and Nick showered Siri with praise and thanks (it was still cute and heartwarming to see a young person using manners with AI) before moving on to asking Siri for dirty jokes (okay, not really. But thank goodness for parental controls just in case).
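If you run Nick’s two attempts through a slot-filling check like the toy matcher above (again, hypothetical names, not Apple’s API), the before-and-after is obvious: the first question fills only the player slot, while the rephrased one fills all three.

```python
# Hypothetical slot-filling check -- an illustration, not how Siri works.
REQUIRED_SLOTS = {"sport", "player", "team"}


def can_answer(filled_slots):
    """True only when every slot the stats lookup needs has been filled."""
    return REQUIRED_SLOTS <= set(filled_slots)


# "Siri, tell me Gareth Bale's stats." -- only the player slot is filled.
print(can_answer({"player": "Gareth Bale"}))               # False

# "...the soccer stats for Gareth Bale on Real Madrid." -- all three slots.
print(can_answer({"sport": "soccer",
                  "player": "Gareth Bale",
                  "team": "Real Madrid"}))                 # True
```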
We talked for a while about why Siri needs all the pieces to a question before she can properly answer it, and how Nick could take this lesson and apply it to other areas of communication in his life. Quite often we assume people are operating in the same context as we are and working toward the same goal. Without establishing core contextual anchors, the responses you get will often not be the ones you are looking for, and this holds true for people in business as well as kids with iPhones.