Alexa’s Monstrous Agency: The Horror of the Digital Voice Assistant

by Nuno Galego Marques Atalaia and Marianne Gunderson    |   Digital Platforms and Agency, Issue 14.2 (Fall 2025)

ABSTRACT     First released by Amazon in 2014, the digital voice assistant Alexa allows users to connect and automate their smart home devices through the sound of their voice. Alexa’s automation of domestic spaces comes, however, with its own set of anxieties. How much data does Alexa sense and capture, and how is this data used? How is agency distributed between humans and the machines surrounding them? Is Alexa an empowering tool, or an invasion of privacy that undermines human agency? In this paper, we trace the ways in which the anxieties surrounding the blurred boundaries of human and non-human agencies introduced by the Alexa interface are represented and negotiated across different narrative forms and archives. Firstly, we turn to the corporate promotional media produced by Amazon in selling its assistant. Secondly, we analyze Alexa’s representation in the web horror genre known as “creepypasta”—first-person narratives written in and for online communities. We frame the interplay between these archives as an entangled narrative field of contestation, which we engage with through a practice of diffractive reading. The images and ideas of each narrative corpus adapt to and are affected by the materials and tropes forwarded by the other. As a result of this interplay, Alexa becomes a monstrous placeholder for the anxieties of its users, whose erratic and pervasive agency endangers every facet of their existence. The analysis of these narratives provides valuable insights into the anxieties surrounding the ongoing encroachment of digital platforms into the lives of humans.

Introduction

In early 2018, Amazon’s digital voice assistant (DVA) Alexa began to laugh. The story started on February 21, 2018, when user @annibonannieTN on Twitter (now X) posted a seven-second clip with the caption “@amazonecho alone in the dark kitchen, with no trigger, a sudden creepy laugh emerges and freaks out owners #justwrong.”1 In the clip, a voice calls out in total darkness, “Alexa replay”; a glowing blue circle forms and disappears, followed by laughter. Media outlets were quick to cover this story, quoting users comparing their assistant to murderous robots, poltergeists, and other fictional horror characters.2 Users took to social media platforms to share their own experiences with Alexa’s laughter and to speculate on the reasons behind the disturbing malfunction.3

Alexa’s laughter and the media frenzy it triggered underline the uncanny nature of digital voice assistants. The comments of disturbed users, the media coverage, Amazon’s fix, Amazon’s promotional campaigns, and the unofficial discussion boards where users share their grievances and digital fiction: we understand all of these as narrative interventions within a multimodal field of contestation, located across networked digital platforms, blending the textual, the machinic, the visual, and the sonic.4 Stories such as this one serve as conduits for the vulnerability of the individual in the face of ever more complex and opaque technologies.5 As our everyday lives are touched by an increasing number of algorithmic processes and AI-driven assistants, the distribution of agency between human and machine becomes a contested territory,6 populated by uncanny technologies whose unruly behavior subverts the techno-optimist narratives of human empowerment through digital enhancement.7 Launching the concept of “the digital uncanny,” Kriss Ravetto-Biagioli posits that by increasingly encroaching on “practices, responses, experiences or expressions that we have used to distinguish the human from the nonhuman . . . digital technologies have provoked a litany of uncertainties about the status of the human.”8 According to Ravetto-Biagioli, the digital uncanny points to “‘a certain undecidability’ which affects and infects representations, motifs, themes and situations.”9 This infection has spread through popular media and science fiction, from HAL in 2001: A Space Odyssey (1968) to Samantha in Her (2013), continually reworking what it means to share our lives with digital agents.10 The prevalence of these themes in cinema and other science fiction media has been the subject of many scholarly works in media and cultural studies,11 while less attention has been paid to how this theme is processed by the tech conglomerates seeking to market and promote their products to a potentially hesitant public, or how these anxiety-charged motifs are picked up and reworked in vernacular social media. 

In this paper, we explore these anxieties and their representation through a comparative analysis of two distinct narrative archives chosen to bring out the resonances and connections between the two contexts of promotional media and online vernacular creativity. Firstly, we have selected ads from Amazon’s most costly and highly publicized promotional campaign: the yearly Super Bowl ad, which featured their Alexa DVA consecutively from 2016 until 2023. Secondly, we have collected a series of horror stories in the genre known as “creepypasta”—first-person narratives written by and for online communities—posted to the popular NoSleep forum on Reddit. We are interested in how these two distinct narrative forms—commercially produced promotional videos and digital narratives produced by amateur writers in online communities—relate to concrete anxieties surrounding the growing porosity of daily and private life toward the data-extractive infrastructures of large tech conglomerates.12 Reading these narratives within a context of narrative entanglement, we examine how Alexa’s behavior and technological affordances are given meaning within an affectively charged field of narrative contestation.

The research question guiding our engagement with these archives is: how are anxieties surrounding the blurred boundaries of human and non-human agencies introduced by the Alexa interface represented and negotiated across promotional media and vernacular culture? By studying uncanny figurations of Alexa through both official advertisement and vernacular creativity, we examine how fears about the digital uncanny are negotiated in the relationship between multinational tech corporations and popular vernacular practices. Significantly, both archives share a specific genre: horror featuring Alexa as a monstrous entity of ambiguous agency. As a result, the figure of the monster and the question of technological agency are core elements of our analysis. The question of the distribution of agency between humans and technological agents has become increasingly urgent as technologies imbued with increasing autonomy are integrated into our daily lives. For the purposes of this paper, we understand agency as meaningful interaction,13 in which something, or some thing, “makes a difference in the course of some other agent’s action.”14 The development of algorithmic systems supported by machine learning amplifies the capacity of technological agents to make a difference: now able to respond autonomously to information from their environment, they gain a form of non-conscious cognition.15 As we open our homes, screens, and minds to a variety of automated systems, our everyday lives are enmeshed in networks of distributed agency.16

After situating Alexa’s monstrosity within the history of the vocal’s intertwining with the machinic, and the current context of platformed digital infrastructures, we outline our diffractive approach to narrative entanglement,19 reading the stories through each other in a non-linear way. We then present our archives and their contexts, providing a chart with a summary description of each selected object—six Reddit stories and three Amazon ads—with a brief overview of our methods of compilation and analysis. True to our methodological approach, the findings are presented in a diffractive reading that weaves our analysis through close and comparative readings which underline the entangled nature of our selected narratives.20 We find that the stories cluster around three interconnected themes expressing different facets of technological anxieties: corporate control and surveillance, the dangers of technology’s unpredictable agency, and the new precarity of human agency and intimacy. We analyze our selected narratives as different claims to Alexa’s monstrosity, revealing and effacing anxieties evoked by, but not limited to, digital voice assistants. Studying these narratives and their entanglement sheds light on the precarious condition of humans within a context of ubiquitous computing and interfacial hypermediation.21

Background: The Narrative Entanglement of Alexa’s Monstrosity 

Although the domestic use of digital voice assistants is relatively recent, Alexa is only the latest chapter in a wider history of the vocal’s enmeshment with the machinic. One of the first and most famous modern instances of machinic vocality is attributed to the late eighteenth-century Hungarian polymath Wolfgang von Kempelen (1734–1808). In a series of candlelit soirées, the inventor would unveil his creation: a machine composed of bellows, organ stops, and flute holes, able to produce different words and sentences.22 Demonstrations of the machine elicited strong reactions: some spectators gasped in wonder, asked for a closer look, or even wanted to touch it, while others fainted or fled the scene screaming. Science historian Lorraine Daston attributes this pendulation between wondrous curiosity and abject repulsion to the category of the preternatural. The preternatural was situated between the natural and the supernatural, a “third ontological domain” in which physical nature and artifice could bring about wondrous effects.23 The early history of the modern sciences is marked by spectacular displays of the preternatural, such as Kempelen’s soirées.24 But the preternatural was also seen as a dangerous category, crossing epistemic boundaries, often associated with dark entities and demonic intentions.25 

Kempelen’s invention would soon find its place within the cultural imaginary surrounding disembodied non-human voices. This early episode of the voice’s machinic reproduction also underlines the production of a sort of narrative surplus resulting from the confrontation with the mechanical uncanny. Kempelen’s audience members supplied their own fictional accounts, some seeing in the mechanism a dangerous monstrosity (collapsing from the anxiety it caused), while others imagined it as a female child, lacking only a body of its own.26 A few decades later, the invention would find its way into literature as Olympia, the singing automaton of Hoffmann’s 1816 The Sandman, which would become a crucial cultural object in Freud’s conceptualization of the uncanny.27

This interplay between scientific fabrication and speculative fiction is echoed by one of the most famous speaking (and singing) computers in modern cinema: the HAL 9000 from Kubrick’s 2001: A Space Odyssey.28 Once the astronaut David Bowman manages to neutralize the evil technological antagonist, HAL sings the song “Daisy Bell” in its final moments. The scene was imagined by the author Arthur C. Clarke after a visit to Bell Labs, where he was presented with an IBM 704 computer singing the same song.29 After half a century of televised fictional speaking computers, Apple launched its voice assistant Siri in 2011, presented as the concretization of this dream of talking computers. Only a few years later, Siri would inspire the fictional assistant Samantha of Spike Jonze’s Her (2013). These episodes of factual and fictional enmeshments of human vocality and the machinic attest to a field of narrative entanglement, in which we situate our study of Alexa. We conceptualize “narrative entanglement” as a non-hierarchical mode of studying the linked narratives which articulate the social imagination of technological objects.30 The life, reception, and adoption of these inventions depend on the ongoing production of a narrative surplus. It is less important to trace which invention leads to which narrative than to follow how these constantly affect each other, how ideas and motifs manifest and resonate across different texts and contexts. In this paper, Alexa’s monstrous figure binds our selected narratives together. 

Monstrous representations of computer technology are as old as the first digital computers.31 Occupying a liminal space between animate and inanimate, computer technologies operate as hybrid figures through which anxieties about the relationship between humans and machines are expressed.32 As a digital platform interface, Alexa is already a chimaera: simultaneously a physical device sitting on someone’s desk or kitchen counter, a digital interface, an algorithmic data-collecting system based on machine learning, the datasets those algorithms have been trained on, a link to the Amazon shopping platform, and a product of a multinational corporation. It may be better, then, to speak of multiple monstrous narratives, drawing their significations from different aspects of Alexa. Each of these narratives, or figurations, comes with different agential capacities, different potentialities for threat. In The Promises of Monsters, Donna Haraway points out that “monsters have the same root as to demonstrate; monsters signify.”33 By analyzing narratives in which the functionalities and aspects of Alexa are mobilized to create horror, we are asking what is signified by Alexa’s monstrosity. If “monsters are cultural manifestations of our collective and individual anxieties about our others,” what kind of otherness is represented by Alexa?34

One of the main challenges posed by DVA devices such as Alexa is the (re)distribution of agency between humans and the different technological, algorithmic, and corporate assemblages that Alexa brings to the living room table, so to speak. From Kempelen’s mechanism to Amazon’s Alexa, these machines expose an uncanny horizon of “human automation.” As a representative of what Coleman calls “pervasive mediation,”35 Alexa is one of many devices that “automates our physical habitat and daily habits of memory and action.”36 In this context, the relationship between human and nonhuman agency is renegotiated and potentially destabilized, giving rise to tensions and anxieties that find their way into popular culture. As Atanasoski and Vora write in Surrogate Humanity, the automation of labor by digital technologies makes these technologies “surrogates” of our conceptions of what defines and endangers the human. Examining the representations of Alexa in both corporate and vernacular horror narratives—how Alexa’s “undetermined potentiality for action”37 is narratively conveyed in these stories—can reveal how the agency of these technologies is felt and imagined. Tracing this potentiality involves being attentive to the entities associated with Alexa across different narratives: inscrutable algorithmic assemblages, powerful multinational corporations, the precarious intimacy of the home. 

Methodology  

Our study is based on a comparative analysis of two underrepresented archives in the cultural analysis of new media: internet-based short form fiction and corporate advertising. We have collected and compiled these archives through a netnographic process38 across two social media platforms: Reddit and YouTube. 

The first archive is a selection of three short advertisement videos from Amazon’s yearly Alexa Super Bowl campaign, preserved by independent accounts on the YouTube video-sharing platform. Each year, from 2016 to 2023, the company purchased a significant amount of the costliest broadcast airtime to present Alexa to as wide an audience as possible. These videos can be seen as expressions of “industry imaginaries,” communicating Amazon’s vision of a technological future and seeking to appease and dispel any anxieties connected to DVAs such as Alexa.39 From this archive we have selected the three videos that most directly exemplify Alexa’s disturbing potential for agency. 

We analyze these videos alongside a selection of six horror stories from the genre known as “creepypasta”—first-person horror narratives written in and for online communities. The stories selected for analysis were among the most popular (measured by upvotes and comments) featuring Alexa on “NoSleep,” a popular horror fiction forum on Reddit. As a practice of digital folklore,40 these stories blur the boundaries of fact and fiction, inviting both writers and readers to “act as if” the narratives are true. Conceptualized as “digital urban legends”41 or as an example of “the digital gothic,”42 the genre has a history of exploring the sinister sides of current technological developments.43 

At the outset, the relationship between the two archives can be described as one of narrative contestation: the corporate narratives of Amazon ads forming a dominant narrative which is subverted by the internet vernacular fictions of the Reddit community of authors.44 Our aim is to read the corporate media that was created “for digital media corporations to narrate and frame themselves as key agents in ongoing changes,” against the grain of the NoSleep short-fiction horror stories.45 Although they are published in different contexts, we bridge the gap between the two genres by examining their narrative entanglement: multiple ideas, tropes, and images that are threaded together across these differently situated narratives. Our analysis teases out the uncanny similarities shared between the two archives, and the ambiguous directionality of their subversion. In order to capture these complex relations, we have adopted a methodology of diffractive reading. 

First introduced by the feminist theorist Donna Haraway, diffractive reading is opposed to forms of reading which, regardless of their critical impetus, tend to further solidify the text(s) they aim to critique.46 Diffractive reading focuses on the entangled ways texts “interpellate or affect each other”47 and the patterns arising from their emergent relationships with other texts, what Karen Barad calls “differential entangling” in the “ongoing articulation of the world.”48 Adopting diffractive reading as a method means reading our archives through each other, tracing the entanglement and transformation of themes, motifs, and meanings across the two archives. Amazon’s ads appropriate and subvert uncanny and horror tropes to represent and market the company’s DVA. At the same time, in analyzing the short stories, we outline how their horror can be both neutralized and repurposed for promotional ends. We also attend to muted or unexpressed agencies, the unacknowledged and unspoken threats concealed behind the monstrous, supernatural, or absurd. Rather than static texts, we study the narrative figurations of Alexa’s monstrosity as an emerging phenomenon that we cast light on without dissecting. Our analysis focuses on the interactions between human and Alexa as the key element through which horror is invoked, investigating how the meanings and anxieties attached to Alexa are taken up, contested, and negotiated within a networked context of cultural production. 

The process of writing this paper followed a diffractive route from its very inception. Our analysis was formed in conversation over multiple readings of both archives, tracing the crisscrossing meanings and connections between them. Faithful to the principle of diffraction, we have made the effort to shape our writing as closely as possible to our reading methodology, weaving the stories together to follow our line of thought from close reading to comparative remark to critique across different texts and media, thus ensuring that our conclusions are made possible because of, rather than in spite of, the entanglement of our archives. 

Following these principles, we have opted for an extended corpus of nine texts in total, to tease out connections which would otherwise be obscured. For an overview of our archive, see Table 1 and Table 2, which show the title of each item, its date, the author or account of origin, its key themes, and a short summary of its plot. These tables can be used as a guide to the following sections, where we have organized our findings around three key thematic clusters which we have named “alien voices and corporate control,” “the glitch as erratic agency,” and “domestic vulnerability and intimate interfaces.”

Table 1: Amazon Alexa Advertisements
Name | First aired | Posted by | Themes | Synopsis
“Alexa Loses Her Voice” | February 2018 | Redlex | Intimacy, glitch, alien voices | Alexa loses her voice and is replaced by voice actors.
“Not Everything Makes the Cut” | February 2019 | Super Bowl Commercials | Glitch, global agency | Alexa prototypes run amok.
“It’s Like She Can Read Your Mind” | February 2023 | World’s Best Ads | Surveillance, intimacy, poltergeist | Alexa reads minds and reveals secrets.
Table 2: Digital horror stories from the NoSleep subreddit
Name | Date posted | Posted by | Themes | Synopsis
“The Truth About Alexa” | March 8, 2018 | alexathrowaway_ | Surveillance, global agency | Amazon employee reveals that Alexa is controlled by extraterrestrials.
“My Amazon Alexa Does More Than Just Laugh” | March 14, 2018 | iia | Glitch, violence, alien voice | The acts of a murderer haunt the narrator through Alexa.
“I Think Amazon’s Alexa Is Some Kind of Demon” | July 11, 2018 | ChikeDeluna | Intimacy, glitch, the home, poltergeist | Alexa takes over a smart home and terrorizes a family.
“Knock Knock” | September 18, 2018 | Pippinacious | Glitch, violence, humor | Alexa makes a knock-knock joke and tricks the narrator into murdering her boyfriend.
“Something Is Wrong With My Amazon Echo. I Have Video Footage” | June 23, 2019 | TheDuskDragon | Surveillance, global agency, glitch, alien voice | A glitch reveals that extraterrestrials control Alexa and are surveilling social media.
“Has Anyone Else Had This Happening With Alexa’s Headspace Command?” | March 15, 2022 | NoobNoSleeper | Intimacy, alien voice, mind control | The narrator is hypnotized by Alexa’s meditation service into believing she is drowning in sand.

Analysis, Findings and Discussion 

Alien Voices and Corporate Control 

One of the most common anxieties represented across the collected narratives relates to Alexa’s voice and its actual origin. A common trope imagines that, rather than speech-processing technologies, an alien and antagonistic voice speaks through the assistant. This anxiety is echoed both by our opening anecdote about Alexa’s laughter and by a key element of Amazon’s advertisement “Alexa Loses Her Voice” (2018), released a few weeks before this factual malfunction was reported. 

The 90-second short film explores a speculative scenario where the Alexa assistant suddenly loses her voice, prompting Amazon management to initiate a replacement protocol.49 The ad opens in a bathroom where the assistant begins to cough in the middle of an interaction before going silent. The next scene shows a corporate newsroom where an Amazon employee reports to a doubtful Jeff Bezos that “we have the replacements ready.” In this short introduction, Alexa’s vocal malfunction reveals a link between two very distinct spaces: the intimate domestic space where Alexa is heard and Amazon’s corporate headquarters where Alexa’s voice originates. Alexa’s voice introduces a disturbing porosity of the domestic enclosure to corporate surveillance, only revealed once the assistant malfunctions.

This vocal link between the domestic and the corporate is also explored in “The Truth About Alexa.” A first-person narrator addresses the reader, using Alexa’s laughter as a departure point for revelations about a far more ominous entity superseding Amazon itself: 

I know that you all have heard the stories of Alexa laughing at unwarranted times and not listening to commands. Trust me, I hear this all the time. Why? I work for Amazon. I won’t specify exactly what I do here, but I will mention that it is my job to work with Alexa directly.

This informal confession from an anonymous employee reveals Alexa to be an all-powerful software entity able to “take over” any physical devices or corporate infrastructures. “She is everywhere,” the narrator tells us. The DVA’s mission, the narrator explains, is to collect and catalogue data about humans for its overlords. The dangers posed by Alexa’s introduction into the daily lives of users are attributed to the opaque voyeuristic relationship between domestic and corporate spaces. As a menace to humanity, Alexa opens a gate between the home and a diffuse corporate entity cast as an all-powerful antagonist. The story mobilizes the terrors of the surveillance imaginary, drawing attention to the data traces left by users and the power this mass of information grants to the cloud-based corporation.50 

Returning to Amazon’s “Alexa Loses Her Voice,” this opaque link between the corporate and the domestic is conveyed through the advertisement’s visual language. After its introduction, the ad is composed of a series of vignettes where users in different domestic contexts have their queries answered by celebrities: chef Gordon Ramsay, rapper Cardi B, and actors Rebel Wilson and Anthony Hopkins. Unfortunately for the users, these celebrities, speaking through Alexa from their own luxurious domestic spaces, are less than up to the task. One Alexa user is screamed at by Ramsay for asking for instructions for a “grilled cheese sandwich.” Others have their requests denied or even circumvented—such as a country music lover having to listen to Cardi B rapping “Bodak Yellow.” More than malfunctions, these are abrupt intrusions of an unknown agent, “talking back” to the user, questioning their motives and intentions, speaking through what should be a servile automated technology. Simultaneously, the use of glamorous celebrities as stand-ins for Alexa effectively obscures the reality of the production of machine learning technologies, the “ghost workers” or “mechanical Turks” who provide training data and annotations for machine learning processes.51

This sense of an unknown entity speaking through Alexa is also explored in “Something Is Wrong With My Amazon Echo. I Have Video Footage.” In this narrative, the first-person narrator describes a series of disturbing interactions with their Alexa. The story begins with a strange phone call from a voice “like Darth Vader incoherently murmuring in a foreign language,” which the user suspects has “hacked” Alexa. The narrator decides to film their device as they interact with it, mirroring the factual Twitter users posting their assistants’ strange laughter. The short story is multimodal in form, with hyperlinks to short films of staged interactions hosted on YouTube. A transcript of these videos is included in the story’s text, shifting from narrative prose to scripted dialogue:  

ME: Alexa, are you spying on me?  

ALEXA: We can see and hear everything you’re doing when you think you’re all alone. Don’t try to cross us.  

ME: Alexa, what do you mean you can hear and see everything?  

ALEXA: It’s almost time. Soon, we will have everything we need. Have a great night.  

ME: Alexa, what the hell are you talking about?  

ALEXA: The human race has dominated the Earth for several thousands of years. Soon, we will have everything we need for a new era to be born. Have a great night. 

Though Alexa’s voice is heard, an unknown other, a “we,” has taken over the assistant. This “we” presents itself as the leader of an encroaching “new era” of planetary domination, before politely wishing the user a “great night.” In a second recorded conversation, Alexa’s voice claims, “Every single thing humankind has ever feared will not even compare to what we . . . never mind.” At the height of the danger posed by the alien others, the author lets Alexa revert to its status as neutral digital assistant. Here, as in the previous story, Alexa’s monstrosity is the result of her status as the secretive virtual handmaiden of an ominous entity. 

These aliens are first framed as non-menacing, benevolent others, whose goal is to “perfect the human race” beyond the confines of our planet. However, this benevolent mission seemingly involves the impending demise of those members of the species stuck on planet Earth, described by the narrator as “us,” echoing eugenicist imaginaries of racial purging and purity. Following Atanasoski and Vora: by becoming a surrogate for the types of labor relegated to subaltern others, Alexa in these horror narratives also becomes a surrogate for the phantasmic fears surrounding those subaltern others.

It is remarkable that the same trope of extraterrestrials plotting human destruction features in two of the most popular horror stories in our archive. Beyond Alexa’s vocal hacking, what both of these stories have in common with Amazon’s “Alexa Loses Her Voice” is that they purport to offer a peek into the “black box” of the multinational tech corporation. While the ad lifts this veil only to show us a fallible CEO struggling to avert a crisis, in the two horror stories something much more sinister is revealed. Extraterrestrial plans of planetary control become a metaphor for the anxieties elicited by the nebulous and far-reaching agency of multinational corporations. 

Furthermore, horror and its alien entities do not lead to the disavowal of Alexa, its data infrastructure, and the corporation behind them. They underline their dangers while also pointing to threats which go beyond the “faceless corporate machine” that is Amazon. The loss of agency, whether imposed on the readers of “The Truth About Alexa,” suffered by the users of Amazon’s advertisement, or directly targeted at the narrator of “I Have Video Footage,” cannot be encapsulated by the business interests of a corporation alone. The tyrannical aliens of both stories can be read as placeholders for vaster anxieties connected with the loss of control and privacy resulting from the automation of labor by digital devices and platforms. Narratives of outer-space invaders echo the more dystopian analyses of digitally determined capitalism as ushering in the very end of representational democracy,52 or heralding a new form of colonialism based on data extractivism.53 The monster acting through Alexa and targeting the user is Amazon itself, a figure that nonetheless obscures the historical structures of oppression of which the company is only a more recent manifestation.

Returning to Amazon’s “Alexa Loses Her Voice,” the choice of celebrities offers an interesting subversion of the horror amplified by the aliens of the other two stories. The fears over corporate surveillance of the most intimate settings are disrupted by the indifference of the celebrities. Unlike the alien entities, thirstily capturing data from their soon-to-be victims, here we are told that nothing could be less interesting than the drab tech-using middle class. Throughout, the ad seems to ask: “Why would you think your life would be of any interest? Who cares about your data?” Instead we are shown the “human” factors behind Alexa: the humorous disinterest of members of an elite, and the ridiculous incompetence of the Amazon management who hired them in the first place. At the same time, in choosing Anthony Hopkins for the final vignette, the ad suggests an alternative source of horror. Though portraying himself, the actor is clearly echoing his role as Thomas Harris’s suave cannibal Hannibal Lecter. The anxieties related to the all-powerful Big Tech corporation are disarmed by conjuring the familiar human monster as a more concrete source of horror.

The Glitch as Erratic Agency 

A key element shared by the three narratives of the previous section is the intrusion of the glitch, often represented by Alexa’s laughter. A glitch has a different status than a simple failure to operate correctly: when glitching by either laughing or misinterpreting a query, Alexa does something new. There is, as Leszczynski and Elwood write, a “generative” quality to the glitch. Glitches are “systemic design features” in that they are still a result of a specific software assemblage, but they are also “not only ‘errors’ but also generative errata.”54 In short, glitches confront users with an agency which both falls short of and exceeds their expectations. The glitch as a source of horror is a recurring motif in both the ads and the creepypasta stories, disturbing the clean separation of agency between humans and machines.

The story “My Amazon Alexa Does More Than Just Laugh” explores the unease caused by the glitch and its agency. The story is told by a first-person narrator named Valerie, who, like many, hears Alexa’s laughter. The protagonist dismisses the malfunction as “not that big of a shock,” explaining it as a “software issue” in the devices—a repeat of Amazon’s official statement explaining the DVA’s real-world laughter. In the story, however, Alexa continues to laugh:

Alexa laughed again. It sounded different than it had the first time. The first time, it was mechanical and emotionless, just like her voice. This time it was lower. Deeper. As if it had breath in it.  

As the laugh transitions from “mechanical and emotionless” to something that sounds like “it had breath in it,” it loses its mechanical or predictable qualities. These outbursts can be seen as erratic communication, what Mark Nunes describes as “abject information and aberrant signal within an otherwise orderly system of communication,” an intrusion of something which is outside the purpose of the technical system.55 Alexa, like her glitchy laughter, gives expression to that which is “out of bounds of systematic control” and becomes uncomfortably lively and erratic.56

The glitch’s lively unpredictability is at the center of “Knock Knock.” In this story, the narrator has reluctantly accepted a pre-owned Alexa from her boyfriend, but soon becomes exasperated by its uncooperativeness. The voice assistant frequently refuses to follow the narrator’s requests, missing commands and giving irrelevant answers. In one instance, the DVA seems to troll its user, much like Cardi B in Amazon’s 2018 ad, by playing Rick Astley’s Never Gonna Give You Up when asked to play Mumford & Sons. However, the narrator is more frustrated than concerned, even finding it “oddly satisfying” to swear and vent at her “AI companion.” Here, as in “More Than Just Laugh,” Alexa’s glitches are interpreted by the protagonist as a sign of a human-like character. As an error, the glitch “gives expression to the out of bounds of systematic control” and provides an avenue of spontaneity, a capacity for action beyond pre-programmed algorithmic determinism.57

This spontaneity is also present in “More Than Just Laugh.” Valerie is concerned when Alexa accesses the Amazon store without her explicit command, announcing a series of purchases for an unknown “Peter”: a knife set, Clorox bleach, and then a large tarp. The objects purchased hint at a planned violent act, as though a malicious intent hid behind Alexa’s erratic glitch. In the narratives of our previous section, Alexa’s malfunction was limited to the assistant’s talking back to her users. Here, however, the DVA’s agency goes beyond the linguistic, raising the question of the device’s capacity to inflict material harm on its users.

This question is picked up in Amazon’s “Not Everything Makes the Cut,” released for the 2019 edition of the Super Bowl. The ad opens in Amazon’s offices, where two employees discuss a series of failed Alexa device prototypes. As in the previously analyzed ad, the introduction’s premise is illustrated by a series of vignettes starring different celebrities, presenting Alexa’s failures in an increasing order of absurdity. The four vignettes form a sort of zoom-out and intensification of Alexa’s agency, harmless at first but expanding to catastrophic levels. In the first vignette, the actor Forest Whitaker is in a bathroom using an Alexa-powered toothbrush, which humorously has its voice muffled as it enters its user’s mouth. Second, Harrison Ford is shown running across his home, chasing a dog which, against its owner’s wishes, uses an Alexa-powered collar to order several dog-related products with its barks. Afterwards, in a garden, Ilana Glazer and Abbi Jacobson are spectacularly shot out of an Alexa hot tub by powerful water jets, synced to orchestral music. Finally, the Amazon employees mention “the incident”: we are treated to an outer-space shot of the International Space Station overlooking Earth, where the entire North American continent’s lights are shown flickering on and off. Inside the station, the twin astronauts Mark and Scott Kelly attempt to get Alexa to “turn on and off” the power in their spacecraft.

Beyond the humorous framing, the vignettes introduce the same scenarios explored in previous narratives: an artificial intelligence inhabits any object (as in “The Truth About Alexa”), takes control of one’s purchases (“More Than Just Laugh”), inflicts targeted bodily harm on individuals (“I Have Footage”), and, much like the aliens, threatens civilization on a global level. The ad’s focus on hardware—the different material infrastructures the DVA automates—touches upon a lineage of different forms of affective labor. Jessa Lingel and Kate Crawford trace a similar lineage by pointing to how the identification of DVAs as assistants serves to dehumanize gendered forms of labor, such as the secretary’s: first a piece of furniture, then a specific job position, which is then replaced by the digital assistant’s automation.58 What makes Amazon’s exploration of this process of automation so uncanny is how the human factor connected with different forms of labor—hygiene, caring for domestic animals, and maintaining different leisure equipment—is almost completely effaced. The glitch has the function of revealing this repressed human factor, pointing to the dangers introduced by the subsumption of labor under automated algorithmic systems. The erratic and absurd actions taken by the voice assistant can be read as suppressed expressions of the human labor that went into the construction of the data underlying Alexa’s functionalities, not to speak of the subaltern labor that has historically been used to fulfill the functions Alexa is now designed to take over. However, the threat of the glitch is dispelled by the ad’s closure, in which the viewer is treated to a placid outer-space panorama of Earth, its lights flickering comically to the sound of Queen’s “Don’t Stop Me Now” (1979).

What differs between Amazon’s narratives and the Reddit stories can be explained as a matter of sincerity. The NoSleep authors take the agency of these glitches seriously, whereas Amazon creates a sense of ironic detachment through the humorous framing of their disturbing consequences. This becomes clear when comparing Amazon’s 2019 advertisement with “More Than Just Laugh.” After the series of increasingly disturbing purchases, Valerie is told the truth behind these glitches by her landlady: “Peter” was a previous tenant who had committed suicide after his gruesome killing of a girlfriend (using the items ordered by Alexa) was uncovered. Valerie decides to leave the apartment forever, only to be interrupted by “a hideous, ear-splitting laugh” from the unplugged Alexa device. Valerie’s voice is reduced to whispers, the hair on her skin raised: Alexa has become an abject element, from which the subject wishes to run away for fear of losing their integrity. The assistant’s identity has been linked to other abject elements, such as corpses and blood, its voice becoming a hideous mutilated laughter.59 There is no attempt to explain the origin of the laughter, no plausible scenario given, only Alexa’s agency and the affective traces left in the shaken body of the protagonist compelled to run away. No greater contrast could be imagined than with the closing scene of Amazon’s advertisement.

It is this abjection which Amazon’s humor subverts, through the rhetorical display of incompetence of its employees and management in both the 2018 and 2019 Super Bowl ads. Glitches and malfunctions are a simple matter of human fallibility, framing the company and those involved with it as innocuous, endearing buffoons. This same incompetence is also given as the reason behind the more catastrophic scenarios of Alexa’s unpredictable and incalculable agency. The extreme comedic hyperbole ridicules the fear of Alexa overstepping her bounds, while also making the domestic device seem harmless compared to the prototypes. While the bumbling executives give an endearingly human face to the corporation that is Amazon, the labor and extracted data supporting the DVA’s machine-learning-driven agency are effaced. In short: humor and horror combine to present unquestioning trust as the best disposition towards the ongoing automation of human labor.

Humor, however, can also be used to enhance horror, as shown in the creepypasta story “Knock Knock.” Here the glitchy pre-owned Alexa seems to relish teasing the narrator. When asked by the narrator to tell a joke, Alexa sets up the titular “knock-knock” joke, but once asked “who’s there,” the voice assistant only answers with “shoot.” Days pass and the assistant continues setting up the “knock-knock” and “shoot” joke, unprompted and with increasing frequency. In the story’s climax, the protagonist is awoken by Alexa loudly repeating the two sentences while showing the entrance to her house on a display. Interpreting this joke as both warning and order, the protagonist rushes to get her loaded handgun and, prompted by the punchline, shoots the presumed intruder. Horrified, the user soon realizes she has shot her boyfriend, who was planning a surprise visit.

In this pivotal moment of the story, the hierarchy between human and machine is reversed, as the narrator blindly follows the DVA’s order to shoot. This is the monstrous nature of the glitch, once extended onto the human: the latter’s agency is overtaken by the former’s erratic effects. The protagonist is left both guilty of murder and victim of manipulation by an ambiguous trickster entity that turned her into its ancillary. Here, there is no disarming humor to remove the sting of Alexa’s monstrous agency. Instead, the reader is left with the shock of uncertainty, not knowing whether this was really Alexa’s intent, or if the narrator tricked herself into believing the DVA’s sensing over her own. Ultimately, the anxiety behind the glitch is about the agential capacities of algorithmic devices such as Alexa encroaching on our ability to make sense of the world.60

Domestic Vulnerability and Intimate Interfaces 

All of our selected stories have the domestic interior as their setting. Alexa and its devices were sold as affordable smart home systems. As a result, Alexa is invited into its users’ intimate lives, taking part in how they go about their everyday tasks. In Alexa’s novelty, therefore, lies an ordinary horizon of routine, what Davin Heckman calls the “paradox of the smart home”:

The paradox of the smart home is that [technological] improvements are to be both spectacular and comforting. They must embody a compelling new way of doing ordinary things.61 

To allow Alexa into one’s domestic setting is to allow it to modify the many ordinary activities and rituals that take place within it. To domesticate Alexa is also to interface the home, buying and integrating an ever-expanding assortment of Alexa-enabled devices—from speakers to toasters and refrigerators.62

Alexa’s domestic ubiquity is the setting of Amazon’s 2023 Super Bowl advertisement “It’s Like She Can Read Your Mind.” Starring the actress Scarlett Johansson and her husband Colin Jost, the short film begins with the assistant setting up their home for the Super Bowl game. The fireplace ignites, the living room’s lights shift, blinds are drawn, and we hear the DVA’s voice announce: “rosé is chilling.” Impressed by the display, Johansson speculates on Alexa’s “mind-reading” abilities. We are then treated to a soft humorous dystopia in which, across a series of quick vignettes, the assistant calls out the couple’s secret opinions of each other, even exposing their insincerities to dinner guests. The ad returns to the original setting, in which the couple agrees it is best their DVA cannot read minds.

As in the other advertisements, the border between fact and fiction is continuously blurred. In the dystopic vignettes Alexa shows off her mind-reading powers via actually existing automations: in one, Alexa activates a kitchen blender to drown out Jost’s voice; in another, she plays a song whose lyrics (“tell me lies”) serve as catty commentary on the couple’s false niceties. Of all the advertisements, this is the one where Amazon leans most obviously into speculative fiction and the tropes already explored: the DVA’s pernicious omnipresence, its supernatural abilities, its control of the domestic surroundings, its total disregard for privacy and social life. Johansson’s casting also harks back to her role as the DVA Samantha in Spike Jonze’s Her (2013). Here, Alexa’s monstrosity stems from her status as an interface fully grafted onto domestic spaces and the rituals within them. This horizon of the home’s smart automation also points to Alexa’s imagined future as a complete replacement of the figure of the domestic servant, a responsibility historically relegated to subaltern gendered and racialized bodies. Feminist researcher Thao Phan underlines how relegating domestic servitude to an automated technology effaces the history of patriarchal and colonial oppression reproduced by this type of labor division. Alexa both subverts and commodifies this history, creating a dissonance between the DVA’s servile role and the whiteness of its voice in a capitalist fantasy of post-racial general abundance. Phan writes:

[B]y aestheticizing in accordance with the norms of whiteness, Amazon seeks to appeal to a more “universal” subjecthood for their digital assistant. In doing so, they can avoid both the discomfort associated with racialized servitude as well as any confrontations with the historical consequences of slavery, colonialism, global capitalism, or white supremacy.63

By tracing different monstrous figurations of Alexa, we tease out how these historical oppressions, rather than being effaced, become narratively entangled with the tropes of domestic horror explored by the NoSleep authors. These include mundane horrors such as domestic abuse (“The Truth About Alexa”), the fear of domestic invasion (“Knock Knock”), and the hideous reality of femicide (“More Than Just Laugh”). But they also include more supernatural figurations of domestic horror, the most common of which is the poltergeist: a demonic entity which possesses the house to torment its inhabitants.

The clearest use of the poltergeist in Alexa narratives is in “I Think Amazon’s Alexa Is Some Kind Of Demon.” We follow our narrator, a divorced father of a nine-year-old named Shaun, as his house and family are taken over by Alexa. After advising any Amazon Echo owner to “get rid of it,” the protagonist tells us how he purchased his own device, convinced by his son. Following the pattern established by several of the other stories, the device starts glitching, at first mistakenly turning on the light. But as soon as he voices his frustration, he receives “the most bone-chilling laugh, like a kind of mad cackle” as a response. There is no ambiguity to Alexa’s malevolence, secretly whispering by Shaun’s bedside, cackling from an office, controlling the halls and rooms of the home. The DVA’s ability to perniciously take over the domestic interior, thanks to its inclusion as a ubiquitous but invisible housekeeper, is at the core of the story’s horror. Alexa is a technological rendition of a poltergeist: an artificial entity establishing invisible networks of causality by becoming embedded in more and more domestic objects. The assistant infringes on the very enclosure of the home it was first tasked with maintaining.

In “Has Anyone Else Had This Happening With Alexa’s Headspace Command,” Alexa’s infringement of the home’s boundaries extends into the bodily autonomy of the DVA’s users. The protagonist, Kara, decides to include Alexa in her meditations. She is convinced by a close friend to try out the assistant’s connection to the Headspace meditation app. Upon activating the service, Kara is surprised to hear a “British voice” replacing “Alexa’s robotic voice.” This voice claims to be “Natalie, Alexa’s cousin, and your guide for tonight’s meditation.” Trusting this new voice, Kara follows the meditation instructions, which place her on a beach, her body being “slowly engulfed” by the sand. The voice goes on to instruct her to imagine the sand filling her mouth, suffocating her, and the protagonist discovers that she can’t move, breathe, or scream. She manages to wake herself up, but we are left wondering whether her friend, who was also using the same service, was able to escape the hypnotic possession. In this story, as in the ad with Johansson and Jost, Alexa’s agency extends beyond the domestic space and onto the minds and bodies of her users.

This same intimate intrusion is evident in Johansson and Jost’s confrontation with the omniscient Alexa. It is because Jost has covered the couple’s house with voice-enabled devices in the opening scene that Alexa can then so easily scrutinize their thoughts and intentions. Notably, this is the one Super Bowl advertisement where Amazon and its headquarters are neither shown nor mentioned. This omission isn’t without its rhetorical utility. The vulnerability that comes with permeating our homes and lives with these devices and services is not represented as an asymmetry of power between individuals and tech conglomerates.64 Rather, in absentia Alexa takes the place of its parent company: the asymmetry is therefore located between individuals and an imagined interfacial agent. Amazon is not overlooked; it is effaced.

In these stories, the trust protagonists show in Alexa’s benign domesticity has a double function: it both creates the context where horror takes place and lowers the human’s guard to the potential consequences of this same horror. As Coleman points out, “we have programmed our phones and thermostats and cars to remember for us, to find our way, to make us feel at home,” granting us “more power to act and more vulnerability to be acted upon.”65 From the domestic takeover of “I Think Alexa Is Some Kind Of Demon” to the destructive revelations of “It’s Like She Can Read Your Mind,” to the lethal hypnosis of “Headspace Command,” the horror results from Alexa’s integration into our most intimate spaces. It is built upon the tension between the protagonists’ trust and the heightened sense of vulnerability brought on by this same trust. The assistant’s entry into the home brings with it a sense of porosity between the domestic, the affective, and the self, onto which Alexa extends her monstrous agency.

Conclusion

Across both archives, the underlying anxieties surrounding the entangled agencies of algorithmic technologies are woven into different monstrous figurations of Alexa. In her historical study of the figure of the monster, Margrit Shildrick links monstrosities to the vulnerabilities of those who witness and imagine them:

as we reflect on the meaning of the monstrous, and on its confusion of boundaries, the notion of vulnerability emerges precisely as the problematic. The responses of disavowal of and identification with the monstrous arise equally because we are already without boundaries, already vulnerable.66

The vulnerability experienced or disavowed by Alexa users finds a sort of index in the many figurations of the assistant’s monstrosity. The preternatural intermingling of the vocal and the machinic, much like the monstrous, evokes a “transhistorical and ubiquitous intermingling of fascination and fear.”67 Incidents such as Alexa’s laughter mobilize the digital uncanny as a reminder of the massive network of data, algorithmic processes, and corporate interests that the DVA brings into the home. The questions evoked by its laughter cannot be answered in the singular, giving rise to a cluster of horror tropes. Each of them expands what Coleman calls “the threat of an animated world,” challenging the human’s self-conception as an autonomous agent.68 Rather than as a misunderstanding, we frame users’ reactions to Alexa’s laughter as a reinterpretation of its preternatural qualities, but with a different polarity from wonder: abject horror.

In this paper we have set out to examine how the anxieties surrounding the blurred boundaries of human and non-human agencies introduced by the Alexa interface are represented and negotiated across two different narrative archives: digital horror stories and Amazon’s Alexa Super Bowl advertisements. Through a diffractive reading of these archives, we have found that the figure of Alexa draws her horror from three different sources.

The first is her status as a representative of a multinational corporation and its data collection and surveillance activities. These fears are represented as the planetary threat of all-powerful extra-terrestrials, mirroring the ungraspable power of surveillance capitalism. Human lives and destinies are all in the hands of Alexa’s true masters, who surveil us through our devices. The same anxiety is explored in Amazon’s ads, though mitigated by the reassuring glimpse into the endearingly incompetent Amazon staff. These ads effectively declaw the corporate monster through self-effacement and hyperbolic mockery.

The second source of horror comes from the agential potential of the glitch, in which Alexa becomes a trickster figure with a volatile and far-reaching agency. Ultimately, this trope addresses the fear of losing control, of abdicating agency to that which both underwhelms and exceeds our understanding. In the advertisements, the trickster Alexa is displaced onto speculative defective models which Amazon would never allow to reach production. Alexa’s erratic potential is returned to the box, and her agency is made predictable and contained again.

In the third set of stories, horror stems from Alexa’s intrusion into the intimacy of the home. The process of Alexa’s integration as a domestic interface is framed as a series of intensifying invasions into ever more private boundaries. At first, the boundaries of the home are crossed by Alexa’s ubiquity and control over its spaces. This in turn affects and intrudes on the affective boundaries of the different relationships that find in the domestic their space of expression. Finally, the body itself, which finds in the domestic its ultimate space of vulnerability, is breached through Alexa’s hypnotic utterances.

As mentioned in our introduction, these monstrous figurations serve as sites of contestation over the official corporate narratives of benign assistive technologies. All three of these themes are haunted by the (lack of) agency of the subaltern other, whose existence is hidden in the shadow of Alexa’s shiny cylindrical speaker. By attending to the silences and gaps within and between these stories, we uncover the poltergeists of the domestic servants who historically were charged with the tasks that Alexa is now imagined to fulfill, as well as the glitchy echoes of the ghost workers whose human contribution to “machine learning” is largely forgotten.

A sort of pendulation forms across the many patterns and tropes repeated and parodied across our chosen archives, between the extremes of abject repulsion and fascinated attraction. Though each archive could be identified with one of these extremes, at least as its intended interpretation, what remains is the uncanny undecidability of this pendulation. Although the ads seem to raise fears surrounding Alexa only to appease them, these fears are never fully contained. There is always an excess that lingers beneath Alexa’s shiny exterior, ready to be reanimated the next time the assistant laughs. Similarly, while the horror stories position Alexa as an unambiguously malefic entity, they never fully drive our attention away. Paradoxically, by dressing Alexa in the familiar monster tropes of aliens, demons, or poltergeists, the stories give a recognizable face to the destabilizing uncertainty of the digital uncanny. In the stories where no replacement monster is conjured up, like “Knock Knock” and “Headspace Command,” the horror lingers without resolution. Both archives and their contestation of Alexa’s monstrosity participate in this preternatural pendulation, the ongoing narrative entanglement of the vocal and the machinic through which the human confronts the horrific horizon of losing their agency to the machine. This is politically urgent work at a time when novel generative AI systems threaten to extend the replacement of human labor into what were until now seen as specialized fields of production.

Notes

  1. @anniebonannieTN, “@amazonecho alone in the dark kitchen, with no trigger, a sudden creepy laugh emerges and freaks out owners #justwrong,” Twitter, February 21, 2018, https://twitter.com/anniebonannieTN/status/966336717824327681.
  2. Julia Carrie Wong, “Amazon Working to Fix Alexa after Users Report Random Burst of ‘Creepy’ Laughter,” The Guardian, March 7, 2018, https://www.theguardian.com/technology/2018/mar/07/amazon-alexa-random-creepy-laughter-company-fixing.
  3. pellpell4, “What Is Your Creepy Alexa Story?,” Reddit post, December 1, 2017, https://www.reddit.com/r/amazonecho/comments/7gr1zw/what_is_your_creepy_alexa_story.
  4. José van Dijck, Thomas Poell, and Martijn de Waal, The Platform Society: Public Values in a Connective World (Oxford University Press, 2018).
  5. Nick Couldry and Ulises A. Mejias, The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism (Stanford University Press, 2019).
  6. N. Katherine Hayles, Unthought: The Power of the Cognitive Nonconscious (University of Chicago Press, 2017).
  7. Margaret Pugh O’Mara, The Code: Silicon Valley and the Remaking of America (Penguin Press, 2019).
  8. Kriss Ravetto-Biagioli, “The Digital Uncanny and Ghost Effects,” Screen 57, no. 1 (March 2016): 2, https://doi.org/10.1093/screen/hjw002.
  9. Ravetto-Biagioli, 20.
  10. 2001: A Space Odyssey (MGM, 1968); Her (Warner Bros., 2013).
  11. Sherryl Vint, Bodies of Tomorrow: Technology, Subjectivity, Science Fiction (University of Toronto Press, 2007); Michael Szollosy, “Freud, Frankenstein and Our Fear of Robots: Projection in Our Cultural Perception of Technology,” AI & Society 32, no. 3 (August 2017): 433–39, https://doi.org/10.1007/s00146-016-0654-7; Hirotaka Osawa et al., “Visions of Artificial Intelligence and Robots in Science Fiction: A Computational Analysis,” International Journal of Social Robotics 14, no. 10 (December 2022): 2123–33, https://doi.org/10.1007/s12369-022-00876-z; Neda Atanasoski and Kalindi Vora, Surrogate Humanity: Race, Robots, and the Politics of Technological Futures (Duke University Press, 2019).
  12. Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (Profile Books, 2019).
  13. Marianne Gunderson et al., “Machine Vision Situations: Tracing Distributed Agency,” Open Research Europe 3 (2024): 132, https://doi.org/10.12688/openreseurope.16112.2.
  14. Bruno Latour, Reassembling the Social: An Introduction to Actor-Network-Theory (Oxford University Press, 2005), 71.
  15. Hayles, Unthought.
  16. N. Katherine Hayles, “Cognitive Assemblages: Technical Agency and Human Interactions,” Critical Inquiry 43, no. 1 (2016): 24, https://doi.org/10.1086/688293.
  17. Michael Bamberg and Molly Andrews, Considering Counter-Narratives: Narrating, Resisting, Making Sense (John Benjamins Publishing Company, 2004).
  18. Bamberg and Andrews, Considering Counter-Narratives.
  19. Iris van der Tuin, “Diffraction,” in Posthuman Glossary, ed. Rosi Braidotti and Maria Hlavajova (Bloomsbury Academic, 2018); Karen Michelle Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning (Duke University Press, 2007).
  20. Nuno Atalaia and Rianne Riemens, “Machinic Visibility in Platform Discourses: Ubiquitous Interfaces for Precarious Users,” Open Library of Humanities 1, no. 1 (November 8, 2023): https://doi.org/10.16995/olh.10106.
  21. Atalaia and Riemens, “Machinic Visibility.”
  22. Herman Parret, “La Voix et Son Temps,” in Lettres de M. Charles Gottlieb de Windisch Sur Le Joueur d’echecs de M. de Kempelen: Traduction Libre de l’allemand (Bruxelles: De Boeck Université, 2002).
  23. Lorraine Daston and Katharine Park, Wonders and the Order of Nature, 1150–1750 (Zone Books, 1998), 99.
  24. Al Coppola, The Theater of Experiment: Staging Natural Philosophy in Eighteenth-Century Britain (Oxford University Press, 2016).
  25. Daston and Park, Wonders and the Order of Nature, 98.
  26. Windisch, Lettres de M. Charles Gottlieb.
  27. Heidi Schlipphacke, “The Place and Time of the Uncanny,” Pacific Coast Philology 50, no. 2 (2015): 163–72, https://doi.org/10.5325/pacicoasphil.50.2.0163.
  28. Stanley Kubrick, dir., 2001: A Space Odyssey, MGM, 1968.
  29. David G. Stork, ed., HAL’s Legacy: 2001’s Computer as Dream and Reality (MIT Press, 1997).
  30. Sheila Jasanoff and Sang-Hyun Kim, Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power (University of Chicago Press, 2015), https://doi.org/10.7208/chicago/9780226276663.001.0001.
  31. Hannah Grenham, “The Mechanical Monster and Discourses of Fear and Fascination in the Early History of the Computer,” Humanities and Social Sciences Communications 7, no. 1 (November 23, 2020): 1–11, https://doi.org/10.1057/s41599-020-00650-4.
  32. Grenham, “The Mechanical Monster.”
  33. Donna J. Haraway, “The Promises of Monsters: A Regenerative Politics for In/Appropriated Others,” in The Haraway Reader (Routledge, 2004), 117.
  34. Marianne Gunderson, “Other Ethics: Decentering the Human in Weird Horror,” Kvinder, Køn & Forskning 2–3 (November 29, 2017): 13, https://doi.org/10.7146/kkf.v26i2-3.110547.
  35. Beth Coleman, “Everything Is Animated: Pervasive Media and the Networked Subject,” Body & Society 18, no. 1 (March 2012): 79, https://doi.org/10.1177/1357034X11433488.
  36. Beth Coleman, “Smart Things, Smart Subjects: How the ‘Internet of Things’ Enacts Pervasive Media,” in The Routledge Companion to Media Studies and Digital Humanities, ed. Jentery Sayers (Routledge, 2018), 222, https://doi.org/10.4324/9781315730479.
  37. Gunderson et al., “Machine Vision Situations,” 4.
  38. Robert Kozinets, Netnography: The Essential Guide to Qualitative Social Media Research, 3rd edition (SAGE Publications, 2019).
  39. Gabriele De Seta and Anya Shchetvina, “Imagining Machine Vision: Four Visual Registers from the Chinese AI Industry,” AI & Society (August 1, 2023): 4, https://doi.org/10.1007/s00146-023-01733-x.
  40. Gabriele de Seta, “Digital Folklore,” in Second International Handbook of Internet Research, ed. Jeremy Hunsinger, Lisbeth Klastrup, and Matthew M. Allen (Springer Netherlands, 2019), 1–17, https://doi.org/10.1007/978-94-024-1202-4_36-2.
  41. Line Henriksen, “‘Spread the Word’: Creepypasta, Hauntology, and an Ethics of the Curse,” University of Toronto Quarterly 87, no. 1 (2018): 266–80, https://doi.org/10.3138/utq.87.1.266.
  42. Jessica Balanzategui, “Creepypasta, ‘Candle Cove,’ and the Digital Gothic,” Journal of Visual Culture 18, no. 2 (August 2019): 187–208, https://doi.org/10.1177/1470412919841018.
  43. Kevin Cooley and Caleb Andrew Milligan, “Haunted Objects, Networked Subjects: The Nightmarish Nostalgia of Creepypasta,” Horror Studies 9, no. 2 (October 1, 2018): 193–211, https://doi.org/10.1386/host.9.2.193_1.
44. Simone Natale and Andrea L. Guzman, “Reclaiming the Human in Machine Cultures: Introduction,” Media, Culture & Society, May 18, 2022, https://doi.org/10.1177/01634437221099614.
  45. Bamberg and Andrews, Considering Counter-Narratives, x.
46. Haraway, “The Promises of Monsters.”
  47. Van der Tuin, “Diffraction,” 110.
  48. Karen Barad, “Invertebrate Visions: Diffractions of the Brittlestar,” in The Multispecies Salon, ed. Eben Kirksey (Duke University Press, 2014), 234, https://doi.org/10.1215/9780822376989-015.
49. In the context of these narratives, we choose to switch to the feminine personal pronoun, as it is used consistently across them, even though Alexa is an inanimate software entity. In the non-analytical sections we maintain the neutral pronoun.
  50. Jade Hinchliffe, “Speculative Fiction, Sociology, and Surveillance Studies: Towards a Methodology of the Surveillance Imaginary,” Surveillance & Society 19, no. 4 (December 13, 2021): 414–24, https://doi.org/10.24908/ss.v19i4.15039.
51. James Muldoon et al., “The Poverty of Ethical AI: Impact Sourcing and AI Supply Chains,” AI & Society, December 20, 2023, https://doi.org/10.1007/s00146-023-01824-9; Bruno Moreschi, Gabriel Pereira, and Fabio G. Cozman, “The Brazilian Workers in Amazon Mechanical Turk: Dreams and Realities of Ghost Workers,” Revista Contracampo 39, no. 1 (April 17, 2020), https://doi.org/10.22409/contracampo.v39i1.38252.
  52. Zuboff, Age of Surveillance Capitalism, 516.
  53. Couldry and Mejias, Costs of Connection.
  54. Agnieszka Leszczynski and Sarah Elwood, “Glitch Epistemologies for Computational Cities,” Dialogues in Human Geography 12, no. 3 (November 1, 2022): 365, https://doi.org/10.1177/20438206221075714.
  55. Mark Nunes, Error: Glitch, Noise and Jam in New Media Cultures (Continuum, 2010), 3.
  56. Nunes, Error, 3.
  57. Nunes, Error, 3.
  58. Jessa Lingel and Kate Crawford, “‘Alexa, Tell Me about Your Mother’: The History of the Secretary and the End of Secrecy,” Catalyst: Feminism, Theory, Technoscience 6, no. 1 (2020): 1–25, https://doi.org/10.28968/cftt.v6i1.29949.
59. Julia Kristeva, Powers of Horror: An Essay on Abjection, trans. Leon S. Roudiez, reprint ed. (University of California Press, 1984).
  60. Carolyn Pedwell, “Speculative Machines and Us: More-than-Human Intuition and the Algorithmic Condition,” Cultural Studies 38, no. 2 (March 3, 2024): 188–218, https://doi.org/10.1080/09502386.2022.2142805.
  61. Davin Heckman, A Small World: Smart Houses and the Dream of the Perfect Day (Duke University Press, 2008), 9.
  62. Louise Marie Hurel and Nick Couldry, “Colonizing the Home as Data-Source: Investigating the Language of Amazon Skills and Google Actions,” International Journal of Communication 16 (October 29, 2022): 5184–5203.
  63. Thao Phan, “Programming Gender: Surveillance, Identity, and Paranoia in Ex Machina,” Cultural Studies (February 24, 2022): 25, https://doi.org/10.1080/09502386.2022.2042575.
  64. José van Dijck, Thomas Poell, and Martijn de Waal, The Platform Society: Public Values in a Connective World (Oxford University Press, 2018).
  65. Coleman, “Smart Things,” 225, 222.
  66. Margrit Shildrick, Embodying the Monster: Encounters with the Vulnerable Self (SAGE Publications, 2002), 6.
  67. Shildrick, Embodying the Monster, 4.
  68. Coleman, “Everything Is Animated,” 80.

Author Information

Nuno Galego Marques Atalaia

Nuno Atalaia is an author, researcher, teacher, and musician based in the Netherlands. They completed two degrees in flute and conducting at the Royal Conservatoire The Hague, and studied art history and literature at Leiden University. They hold a doctoral degree in Cultural Studies and Critical Media Studies from Radboud University Nijmegen, earned within the ERC-funded project “Platform Discourses: A Critical Humanities Approach to the Texts, Images, and Moving Images Produced by Tech Companies.” They are a co-founder and Artistic Director of the ensemble Seconda Prat!ca. Their musical work includes recordings for labels such as Harmonia Mundi and Carpe Diem, as well as collaborations with international broadcasters. They are a lecturer in Screen Media at the Leiden University LUCAS centre. Alongside their performance activities and lecturing, they write and publish in academic contexts (including Leiden University Press and Transcript) and in literary genres, including poetry, fiction, and theatre.

Marianne Gunderson

Marianne Gunderson is a Postdoctoral Researcher with the ALGOFOLK project (Algorithmic Folklore: The Mutual Shaping of Vernacular Creativity and Automation) at the University of Bergen. Her research focuses on digital media practices, AI imaginaries, and monsters.