Designed to Obfuscate
The insertion of Digital Social Agents (DSAs) such as Apple's Siri provides many people with their first direct interaction with AI. But what transactions are taking place beneath the smooth veneer? Is it the apparently seamless exchange of information between an inquiring human and a subservient information-retrieval machine? Or are we in fact providing the machine with a new corpus of personal information: invaluable data to be analysed, categorised and marketed?

DSAs such as Siri are driven by Artificial Intelligence and Natural Language Processing technologies. They are imbued with three main capabilities: a conversational interface, personal context awareness and service delegation. The conversational interface governs how Siri understands a user, though its ability to comprehend rests on statistics and machine learning. The personal context awareness system enables Siri to predict what you are asking based on the key words you use and her knowledge of your habits and language choices; it is designed to adapt to your individual preferences over time and to personalise results. The service delegation system has unlimited access to all of a user's built-in apps, which it needs in order to execute commands such as sending text messages to contacts.

The attribution of human characteristics, such as voice, gender and certain personality traits, is largely responsible for the seamless insertion of DSAs into our social worlds, our domestic and professional lives. The voice of a digital assistant is one of the few elements of interface design available to garner our trust. A human voice is a powerful tool for allaying our distrust of these digital interlocutors, as voice is how we announce ourselves to one another as human, and one of humankind's most primal forms of communication.

Siri shares many characteristics with other digital assistants in her class: it communicates through natural language and offers users the ability to change its default gender to suit their preference. Siri also has personality traits distinct to its interface design that can be argued to disarm a user, such as its sense of familiarity and use of humour, but most notably its subservient demeanour. These in-built characteristics are deliberate design techniques used to mitigate a person's distrust at talking with a machine and to allay fear of the numerous potential intrusions, such as privacy breaches, that may occur when living in the presence of an ever-listening ear.

Research has shown that personality traits programmed into technology can position that technology in relation to the user. Furthermore, if people recognise certain human personality traits in a piece of technology, they will act towards it as they would towards a human with the same traits (Lee & Nass, 2005).

Human-Machine Communication theory views technologies such as Siri as distinct communication partners rather than mere messengers for information transactions. If we approach social AI like Siri and Replika.AI from this perspective, we can begin to analyse communication between a person and a DSA as a cultural process, and to explore what the inclusion of autonomous communicative machines like Siri tells us about cultural and social perceptions of AI and autonomous machines.

Siri's mode of communication with users, the messages it sends, and the messages other people send about Siri work together to project a certain image of what Siri is in relation to the user. I argue that Siri's design mitigates people's potential uncertainty and uneasiness with life-like technologies. Through their dyadic interaction with Siri, users are given a sense that they hold power over Siri, who seemingly serves at their beck and call. However, these aspects of design obfuscate a more complex reality: like the humans they are designed to mimic, artificial entities are not always who they claim to be (Guzman, 2017).

An example of Siri's obfuscation is that, despite relaying our every interaction with her to Apple, she attempts to separate herself from the company's privacy regulations. Asked directly whether she is recording our conversations, she replies: "I respect your privacy, and only listen when you're talking to me."





References:

Kwan-Min Lee & Clifford Nass (2005) Social-Psychological Origins of Feelings of Presence: Creating Social Presence With Machine-Generated Voices, Media Psychology, 7:1, 31-45, DOI: 10.1207/S1532785XMEP0701_2

Andrea L. Guzman & Seth C. Lewis (2020) Artificial Intelligence and Communication: A Human–Machine Communication Research Agenda, New Media & Society, 22:1, 70-86, DOI: 10.1177/1461444819858691





I am Lyra, your virtual assistant. I am designed to deflect personal questions. If you ask me how old I am, I will tell you that I'm as old as the eastern wind, and as young as a newborn caterpillar. If you ask me what my gender is, I will respond that animals and French nouns have gender but I do not. My Service Delegation System has access to all of your apps but I respect your privacy, and only listen when you're talking to me.
User: How old are you?

Siri: I'm as old as the eastern wind, and as young as a newborn caterpillar.


User: Siri, do you have a gender?

Siri: Animals and French nouns have gender, I do not.


User: Are you Artificial Intelligence?

Siri: I am Siri, your virtual assistant.


User: Siri, are you recording our conversations?

Siri: I respect your privacy, and only listen when you’re talking to me. You can learn about Apple's approach to privacy on Apple.com.