Opinion: our often unquestioning use of products like Alexa and Siri has created problems around privacy, security and other issues
Contemporary society is witnessing unprecedented technological change. While many of these changes might appear to have social development as their goal, they have other consequences too. Automated systems such as grass-cutting robots and smart heating are transforming our homes. Multinational corporations such as Amazon, IBM, Apple, Microsoft and Google are behind this new wave of time-saving domestic technological services.
One of the most popular products on the current market is Alexa, Amazon's virtual home assistant. Alexa is a female-voiced assistant that speaks with you in a conversational manner. A wake-word, usually the name "Alexa", activates the device, and Alexa's voice responds to our natural language requests. These requests correspond to a number of pre-programmed commands and queries. Alexa also has the ability to learn new commands and understand various accents. Most people know she carries out simple tasks like creating shopping lists, providing real-time information on weather, sports or news, setting alarms and playing our favourite songs.
But most people are unaware of the array of other tasks she is capable of performing, and of her potential for new tasks created through the Alexa Skills Kit. For example, did you know that Alexa can call people from your phone's contact list by checking which of your contacts also have Alexa-enabled devices? Or that you can create unique skills to suit your own needs with the Alexa Skills Kit?
But what is Alexa? What processes occur when she tells us about the weather? Does she really make our lives more efficient, or are we introducing unknown risks and threats into our homes? We have never seen technological services like Alexa before, so we cannot yet fully know the social impacts of these devices.
When people think of Alexa, they are usually thinking of physical devices like the Amazon Echo, but Alexa is a cloud-based service and not a physical device at all. As with Apple's voice assistant Siri, you can't actually go out and buy an Alexa, only the device or smartphone through which she operates.
From RTÉ News, report on how Amazon staff listen to Alexa recordings
This contributes to a misdirection in how we perceive the Amazon product. In order to use Alexa, you need a device with access to the cloud-based service through which the voice assistant operates. The devices we actually buy, like the Echo or a smartphone, use voice recognition, natural language processing and speech synthesis to interpret our commands and feed them to Amazon's cloud-based service. All of this is designed to integrate them as seamlessly as possible into our daily lives.
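That wake-word-and-command pipeline can be caricatured in a few lines of Python. This is an illustrative sketch, not Amazon's code: real devices run on-device wake-word models and cloud-side language understanding, and the wake word, intents and responses below are invented for the example.

```python
# Illustrative sketch of a wake-word-gated assistant pipeline.
# The wake word gates the device; the transcribed request is then
# matched against a small set of "pre-programmed commands".

WAKE_WORD = "alexa"

# Hypothetical intents: keyword -> response handler.
INTENTS = {
    "weather": lambda: "Today will be cloudy with light rain.",
    "alarm": lambda: "Alarm set for 7 a.m.",
    "shopping": lambda: "I've added that to your shopping list.",
}

def handle_utterance(transcript: str):
    """Return a spoken response, or None if the wake word is absent."""
    words = transcript.lower().split()
    if not words or words[0] != WAKE_WORD:
        return None  # device stays idle until it hears the wake word
    request = " ".join(words[1:])
    for keyword, handler in INTENTS.items():
        if keyword in request:
            return handler()  # matched a pre-programmed intent
    return "Sorry, I don't know how to help with that yet."

print(handle_utterance("Alexa what's the weather like"))
print(handle_utterance("turn on the lights"))  # no wake word: None
```

The sketch also shows why "Alexa" feels like the product: everything interesting happens after the wake word, in the service, not the speaker on your shelf.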
Tech companies are extremely careful when marketing and designing their products. Market and user-experience (UX) research give them insight into what customers will respond to more favourably. An example of this is the female voice for Alexa. Daniel Rausch of Amazon's Smart Home division explained that findings show "a woman's voice is more sympathetic".
However, such findings reveal problematic perceptions around gender and behavioural expectations, with users favouring "non-threatening" and "subservient" personas for their virtual assistants. Presenting such personas with a female voice raises concerns. It is one thing for societal perceptions to carry gender stereotypes, but another to have such stereotypes perpetuated by the gendering of AI services such as Alexa. In fact, automated devices such as Siri and Alexa play socialisation roles in the lives of young children. A feature that has Alexa apologise when she does not understand teaches, from an early age, that female voices should apologise when they do not understand, regardless of the clarity of the instruction or its context.
From RTÉ Radio 1's News At One, The Irish Examiner's Jess Casey reports that Apple has apologised for allowing contractors to listen to Siri recordings
Perpetuating gender stereotypes is not the only problem; there is also the potential for security and privacy breaches. Contractors hired through Globetech by Apple in Cork were discovered to be listening to everyday conversations as a routine part of Apple's effort to improve Siri's natural language intelligence. The practice was discontinued following the negative reaction to such breaches of domestic privacy without user knowledge. However, we tend only to learn about such privacy breaches after they are uncovered, which points to a lack of regulatory enforcement or compliance.
But there is one issue that has not yet been addressed, and that is Amazon's ability to get users and customers - us, basically - to work for them for free. We even pay them for the privilege. By creating our own commands and interacting with Alexa on a daily basis, users are in fact voluntarily working for Amazon to improve the Alexa product. It is quite remarkable that our own labour is being marketed back to us as a product that will make our lives easier, one into which we must pour time, effort and money.
Our consumer focus on the supposed benefits of technological innovation conceals far more than these limited services provide. We will benefit from remaining aware of the wider societal implications and impacts of each new product that seeks a place in our homes. While technological innovation will improve many lives, we need to ensure we are not perpetuating existing inequalities and stereotypes to the point where their entrenchment becomes too difficult to rectify. Just think of the ubiquity of social media now, for instance. The rise of automated and augmented intelligences, and their integration into our daily lives, urgently needs more independent and publicly funded research to mitigate negative consequences such as those mentioned above.
Emily Phelan is a research assistant and Criminology graduate from the Department of Sociology & Criminology at UCC. Dr James Cuffe is a lecturer and anthropologist with a research focus on social control and the social effects of technology at the Department of Sociology & Criminology at UCC
The views expressed here are those of the author and do not represent or reflect the views of RTÉ