[OPINION] Siri is not safe from sexism & perpetuates patriarchy
2013 saw the release of the film Her, starring Joaquin Phoenix and Scarlett Johansson. Set in the near future, it focuses on Theo, a writer played by Phoenix. He works at a website where people can request handwritten letters, which the company then sends to the recipient. The letters could be about anything: apologies, love notes, sympathy cards and so on.
Theo has no social life and has just gone through a really bad break-up, so he finds company in an artificially intelligent operating system called Samantha. Samantha is essentially an electronic secretary. She organises his life, checks his calendar, reminds him of errands and, it must be said, she is really good at what she does. She even has the capacity to anticipate his needs before he voices them. Long story short, Theo falls in love with her, and even though Samantha is a bot who is supposed to show no emotion, she reciprocates. The perfect partner for a straight male: a robot who loves him, gives him what he wants, but who can still, to an extent, be controlled. Patriarchy at its very best.
In 2013, Samantha was not so much science fiction as science fact, inspired by Siri. Well, to an extent. Siri, Apple’s own bot personal assistant, was launched in 2011, two years before the film’s release. During this phase, Siri came in one sex – female. Siri does not have the capacity to fall in love with you, display any kind of independent emotion or anticipate your needs. But she is reactive.
I myself have spent many a bored moment lying on the couch asking Siri a series of random questions to test exactly how independent her thinking was, and also how far she could be pushed before her response was just a Google link. The best outcome so far came when I asked her which products were better, Apple or Microsoft. Her response: “I can’t say, Haji, but I hear you should not put the cart before the horse.”
That’s pretty witty, let’s be honest, but Siri has also been described as having a nurturing, happy and helpful timbre to her voice. It’s warm – still is, even though you can now select a male voice. Some have even called it seductive. And it’s that variable of ‘seduction’ that led many a journalist to investigate the connection between the voice-activated command software and the male fantasy. Siri lost the battle. Hopelessly.
Brooklyn-based journalist Kashmir Hill wrote in Forbes magazine: “Siri behaves much like a retrograde male fantasy of the ever-compliant secretary: discreet, understanding, willing to roll with any demand a man might come up with, teasingly accepting of dirty jokes. Oh yeah, and mainly indifferent to the needs of women.” Hill tested this theory by interrogating Siri further, and found again and again that Siri reacted differently to the needs of men than she did to the requests of women.
“When you say to Siri, ‘I need a blow job,’ she produces ‘nine escorts fairly close to you’. You get the same result if you say ‘I’m horny’ into it, even with my very female voice. And should you need erection drugs to help you through your encounter with one of the escorts, Siri is super-helpful. She produced twenty nearby drugstores where Viagra could be purchased…” wrote Hill.
But when Hill focused her questions on the less pleasurable but more important needs of women, Siri came up short. When asked where she could find birth control, all Siri could offer was one clinic miles away, even though there was a Planned Parenthood clinic down the road. She also failed to mention the hundreds of drugstores that stock birth control pills (as well as Viagra).
The conclusion? Siri was just another example of the sexism deeply ingrained in society. Siri is the creation of a group of male technologists, and while the programmers may not have had the conscious intention of being offensive to women, they clearly failed to consider that many of their users would be female. They programmed for a male audience and forgot about the concerns of the other half of the human race. It’s almost more troubling to me that all this happens so organically and manifests in something so absurd. Even a robot that speaks from a phone is not safe from the influence of patriarchy, and ends up perpetuating its norms.
Since Siri, many other tech companies have followed suit. Amazon’s Echo has Alexa; Microsoft has Cortana. All these digital assistants are female. It’s the modern-day version of a 1950s office, where the professions were reserved for men and the only role for women was to serve them.
I have asked Siri, set to the female voice, whether she is a woman. Her answer is that she is genderless. But that does not mean she is not still culturally thought of as one. Companies treat men as the ideal user and play into the notion that female voices help men feel more powerful. Studies have shown that a digital assistant with a woman’s voice is more likely to make a man feel powerful than one with a male voice – so if the voice is passive and feminine, more men are likely to use the product, which of course raises profits. Diversity is still ignored, in design as well as in commerce.
But how exactly are these bots feeding into the patriarchy, to the degree that they might even be perpetuating sexual harassment? They are programmed to be passive, gracious and in some cases even flirtatious when faced with abusive language or harassment. Again, the perfect position for a woman to be in, as far as any misogynist is concerned. It is, of course, no secret that when it comes to these types of men, women should not have a problem with knowing their place, lest they be ‘shown’.
This week a report by Quartz revealed that none of the assistants could produce a definitive answer to the question of whether rape is okay, evading the question in most cases.
To the app creators at Google, Amazon and Apple, the answer you’re looking for is: no. Rape is never okay, not ever. And it’s not hard to steer away from the promotion of stereotypes, patriarchy and sexual harassment. So instead of shipping yet another cable port with your next update, rather focus on that.
Haji Mohamed Dawjee is employed by Code For Africa at the head office in Cape Town as programme manager for impactAFRICA - the continent's largest fund for digital-driven data storytelling. She is a regular commentator on gender equality, sexuality, culture, race relations and feminism as well as ethics in the South African media environment.