Hacking the narratives of the digital
Addie Wagenknecht between error, play and futures to be rewritten
In an increasingly fast, uniform technological ecosystem shaped by dominant narratives, speaking about the future often means accepting what has already been decided. The work of American artist Addie Wagenknecht moves in the opposite direction: hacking the stories we tell about technology, intervening in systems from within, and exploring the margins where error, play and deviation become political tools and artistic material.
An artist and researcher, Addie Wagenknecht has worked for years at the intersection of technology, political imagination and feminist practices, moving across languages that range from hacking (intervening in and repurposing technological systems to alter their logics and uses) to performance art and the critique of digital systems.
Addie Wagenknecht (b. 1981) is an American artist and researcher living in New York City and Liechtenstein. Her work explores the tension between expression and technology. She seeks to blend conceptual work with forms of hacking and sculpture, and deals primarily with ideas relating to pop culture, feminist theory and hardware. Previous exhibitions include the Centre Pompidou, the Istanbul Modern, the Whitechapel Gallery and the New Museum in New York, among others. She has collaborated with CERN, Chanel, the Whitney Museum of American Art, and Google’s Art Machine Intelligence (AMI) Group. Her work has been featured in numerous books and magazines, including TIME, The Wall Street Journal, Vanity Fair, Art in America, and The New York Times.
Her practice contributes to imagining unauthorised, malleable and collective futures, in which technology is not a destiny but a terrain of intervention. Deep Lab, a transdisciplinary project active until the late 2010s, was one of the spaces in which this research took shape, leaving a legacy that continues to inform her work.
In dialogue with HerTech, a European project dedicated to rethinking access to and the representation of female students and workers in technological fields, we met Addie Wagenknecht, a key figure in a feminist practice spanning art and technology, to talk about error as a method, play as a radical gesture and the poetic urgencies running through the contemporary digital landscape.
The Deep Lab project emerged from a strong idea: that the future is not something to observe but something to “engage with”. What does “taking part in the future” mean to you?
I suppose it means actively intervening in technological narratives, subverting them from the outside or operating within systems just enough to go unnoticed. Hacking, mixing1, co-creating in order to insert a perspective that refuses to leave the task of dictating the future to Silicon Valley or to market forces.
It’s about malleability: the future as collaboration or prototype, not as a finished product.
Since the late 2010s, after the decision, taken with co-founder Maral Pourkazemi, to dissolve Deep Lab, I’ve continued to facilitate workshops and research that echo the ethos of the people who were part of it, but today I carry this forward through individual and collaborative works across multiple spaces, both geographic and technical. These projects investigate new alignments within a changing culture, where values are increasingly shaped by algorithms and podcasters. It is not by chance that in Self-Design and Aesthetic Responsibility, philosopher Boris Groys2 describes what is happening as a form of ethical self-curation—an era in which the aura of the artwork has dissolved into reproducibility.
Deep Lab
Deep Lab was a transdisciplinary, collective project founded in December 2014 on the initiative of Addie Wagenknecht as a critical response to contemporary digital culture and peer-to-peer culture. It consisted of an international group of artists, researchers, writers, engineers and cultural producers working together to explore and question issues such as privacy, surveillance, code, art, social hacking, capitalism, race, anonymity and 21st-century infrastructure. The collective was supported by Carnegie Mellon University’s Studio for Creative Inquiry, directed by Golan Levin, and by the Warhol Foundation; during a week-long incubator it produced a 240-page book, the Deep Lab Lecture Series, and a documentary collecting contributions and reflections from participants, as well as an extended programme of workshops, education and “hacktivism”. The project involved numerous internationally recognised voices from journalism, digital activism, academia and art, and existed as a community of practice until at least 2019.
How does your recent artistic practice align with this cultural transformation, in which value, not only of the artwork but also of identities, is increasingly mediated by algorithms, platforms and practices?
In Honest Day’s Work (2024), I take part in this reflection by repurposing blockchain and freeports3 into a performative sculpture in perpetual transit: collectors scratch lottery tickets for a chance at ownership. The work exposes the absurdities of the art market—from tax havens to speculative risk—and is Duchampian4 in that it transforms financial choreography into an artistic medium. What interests me is making systems that are normally opaque—both technological and artistic—visible and legible, so that more people beyond the 1% can benefit from them5. My work is therefore both an educational act and a political statement.
Beginning with Alone Together (2018)6, this trajectory has been grounded in embodied hacks7, such as manipulating Roomba vacuum robots to critique isolation in the domestic lives of mothers. The goal is to transform seemingly neutral devices into tools that reveal the power relations, gender dynamics, and systems of value that structure them. Today this work extends to the “placelessness” of blockchain, which produces value and ownership without anchoring them to a physical space or specific jurisdiction, creating an apparently neutral environment that in fact obscures responsibility and power—while inviting active participation in futures where technology serves vulnerability rather than exploitation.
Your work pays close attention to female genealogies of technology. Why is this perspective so important, and how can it change the way we design the digital today?
Female genealogies—think of Ada Lovelace’s8 poetic machines or the coded revolutions of the women who programmed ENIAC9, the first programmable electronic digital computer in history—are the hidden threads of technology. They emphasize care, awareness, and relationality rather than domination. They are essential for dismantling the myth of innovation’s bro-culture10, revealing how women have been erased in order to sustain the patriarchal efficiency of systems designed by men, for men. This perspective can redefine digital design through a different hierarchy of priorities: interfaces that reject models based on extracting value from users, prioritizing sustainability over Return on Investment (ROI). In this context, self-design11 becomes an ethical issue: when identities are shaped through interfaces and algorithms, the way digital systems are designed directly affects how subjects represent themselves, assign value to themselves, and become interchangeable.
Redefining the digital today means infusing these genealogies into AI systems, working with non-linear structures that value imperfection over ideals of perfection.
I think the reason AI threatens so many mediocre middle-aged men is tied to a culture of being “confidently wrong,” while using podcasts and platforms like “X” (Twitter) as sources of absolute truth. AI is no different—it’s just faster and sexier.
Your practice often embraces failure, deviation and glitch, i.e. errors, malfunctions or anomalies in technological systems that produce unexpected results. What role does error play as a creative tool or a feminist practice?
Error is the system’s confession. It exposes its flaws and, in doing so, humanizes technology by rejecting the ideal of perfection in favor of vulnerability. As a creative tool, error opens non-linear paths and makes possible more humane relationships, where disorder is celebrated instead of efficiency.
In Deep Lab, we worked with what we called “Error Parties”: moments of collective experimentation in which code was deliberately pushed to its limits until it collapsed into visual and textual forms resembling manifestos. In these contexts, glitches were not problems to fix but aesthetic ends in their own right. Embracing failure thus becomes a form of resistance: glitches reveal biases and make visible the asymmetries usually hidden within technological systems.
Then, with Honest Day’s Work (2024), I explore the idea of failure through lotteries: scratching the ticket destroys it, yet it is precisely this gesture that makes winning possible. It’s a gesture that echoes the suspended risk of freeports12 and imagines technology as something iterative and non-linear—a process made of attempts, discards, and possibilities, rather than optimization and guaranteed success.
In your work, from the creative use of drones as painting tools in Black Hawk Paint (2007) to routers turned into digital graffiti in WifiTagger (2013), play, curiosity and freedom of exploration recur often. What role does playful experimentation play in your approach to technology? Can it become a form of resistance and/or artistic material?
Play is a subversive alchemy: when it works, it transforms the grind of technology into joyful rupture. Curiosity seeps through the cracks of automation. Pleasure and experimentation reclaim agency13 from mechanisms of surveillance and control, turning rigid algorithms into expressive canvases. Play becomes a form of resistance by inserting absurdity into systems of control, and artistic material when code is remixed like poetry or failures like graffiti.
It’s a way of nurturing a collective imagination within an ecosystem dominated by capitalist extraction.
One project that embodies this attitude is LOL (Liberation of Lulz) (2018) by Deep Lab: an exercise in radical play where memes and digital culture were remixed into feminist hacks, such as AI fortune-tellers designed for subversive fun, capable of holding critique and pleasure together. This kind of play, for me, is a way of restoring self-design. As the “ELIZA effect”—described by Rob Horning14—shows, our relationship with machines is full of projections and illusions. Play intervenes precisely here: instead of reinforcing these projections, it makes them visible, exposing technological interaction through a form of conscious roleplay in which the user regains agency and critical distance. In Alone Together (2017), the blue trails left by a Roomba moving around my body become playful obstructions: they produce a “non-selfie” of a body that is no longer present, while simultaneously critiquing the technology-induced isolation described by sociologist Sherry Turkle15. Here play acts as resistance because it humanizes, invites the exploration of vulnerability, and opens relational spaces where there was previously only automation.
You often explore less visible areas of the digital, spaces considered peripheral or “unofficial”. What kind of aesthetic or research experience can emerge in these places? Do they still offer glimpses of freedom compared to an increasingly uniform and commercial web?
Margins generate “glitched” aesthetics—ephemeral, non-monetized—that emerge from encrypted digital flows. By margins I mean darknets, forgotten P2P networks, mesh grids16 set up in abandoned warehouses, or even the hidden side of blockchains and AI training datasets that never reach mainstream feeds. These are the irregular geographies I mentioned earlier, places that resist the polished uniformity of commercial platforms.
In my work, exploring these margins is not just a research practice: it’s a form of poetic insurrection. Here aesthetics emerge raw and unfiltered—ephemeral soundscapes generated by encrypted traffic, visual debris from corrupted data streams, hallucinatory renders that glitch into something profoundly human. It’s not about building a seamless user experience, but about seeking beauty in deviation: jagged, provisional forms that challenge the algorithmic demand for continuity and perfection.
Do they still offer spaces of freedom? Absolutely yes, even though they are shrinking and increasingly under siege.
In a web dominated by algorithmic feeds—which, as Rob Horning notes, “import biases and tempt us with passive optimization”—these margins remain vital refuges. They allow non-monetized creativity, anonymous collaboration without surveillance, and experiments that evade dragnet algorithms.
Which technology feels “poetically urgent” to explore today?
Artificial intelligence is urgent. It’s urgent because it challenges illusions and averages, and forces us to rethink the self as a networked entity. It compels us to stop seeing ourselves as isolated creators and instead as “glitched” nodes within vast, distorted networks.
In these systems, what artist Hito Steyerl17 calls “mean images” flatten our messy realities into statistical mediocrity, while emotional manipulation dynamics lead us to project intentions and sensibility onto code that does not possess them. The urgency, for me, lies precisely here: to subvert these systems in order to recover a more human and relational self within a technological context of continuous acceleration.
What do you hope young women, creatives and researchers will take from your work? And what kind of relationship with technology would you like them to feel free to build?
I hope they take audacity with them. That they build insurgent ties and are not afraid to fail. Failure, for me, is the only way we learn how to change and adapt.
I would like them to use technology as an extension of themselves, to push forward a future they truly believe is possible—and to find others who are looking for something better and who refuse to accept the present we’ve been given as it is.
- In the context of artistic and digital practices, mixing means breaking down and recombining existing materials – code, images, languages or infrastructures – to change their meaning, use or symbolic power. ↩︎
- Boris Groys is a philosopher and art critic who has long reflected on the relationship between aesthetics, technology, and subjectivity. In Self-Design and Aesthetic Responsibility, Groys analyses how, in the digital age, identity becomes an aesthetic and political project, exposing the self to dynamics of visibility, reproduction and ethical responsibility. Groys, B. (2008). Self-Design and Aesthetic Responsibility. E-flux Journal. Retrieved December 20, 2025, from https://www.e-flux.com/journal/07/61386/self-design-and-aesthetic-responsibility ↩︎
- Blockchain is a distributed ledger technology often associated with decentralisation and transparency. Freeports are duty-free zones, often used in the art market, where works can be stored without being subject to taxation or customs controls. They are a symbol of financial opacity, speculation, and inequality in the global art economy. ↩︎
- Duchampian is an adjective that refers to Marcel Duchamp’s conceptual approach: the work does not reside in the object, but in the gesture and the system of meanings it questions. In this sense, economic or financial processes can become artistic material. ↩︎
- Economic studies on inequality – starting with the work of Thomas Piketty – show how, in contemporary economies, returns on capital grow faster than income from labour, reinforcing mechanisms of accumulation that exclude the majority of the population from the benefits of financial and cultural systems. Ref. Piketty, T. (2014). Capital in the Twenty-First Century. Cambridge, Massachusetts: Belknap Press of Harvard University Press. https://dowbor.org/wp-content/uploads/2014/06/14Thomas-Piketty.pdf ↩︎
- In the Alone Together project, the artist modifies a robot vacuum cleaner (Roomba) transforming it into a pictorial and critical tool. Wagenknecht loads blue pigment (the infamous International Klein Blue) into the Roomba, reprogramming its behaviour so that it distributes colour instead of vacuuming. She lies naked on a white canvas while the robot circles around following its algorithm. The Roomba does not “feel” the body but avoids it as an obstacle, mapping the territory until it covers the surrounding surface. The result is a painting where the body does not appear as a presence but as an absence of colour, a void that draws the female form through the trace left by the robot. ↩︎
- Embodied hacks are artistic and critical practices that use the body as a tool for intervening in technology, highlighting how digital systems influence physical, emotional, and relational experiences. ↩︎
- The term “Lovelace’s poetic machines” refers to Ada Lovelace‘s idea that computing machines can operate not only on numbers, but also on symbols, relationships, and creative forms, opening up the possibility for imaginative and not purely functional uses of technology. Ref. Lovelace, A. A. (1843). Notes by the translator. In L. F. Menabrea, Sketch of the Analytical Engine invented by Charles Babbage (pp. 691–731). Richard and John E. Taylor. https://johnrhudson.me.uk/computing/Menabrea_Sketch.pdf (Original published in 1843 in Scientific Memoirs, Vol. 3) ↩︎
- The phrase “the coded revolutions of the women who programmed ENIAC” refers to the pioneering contribution of the first ENIAC programmers, whose work defined fundamental concepts of modern programming yet was long excluded from the official narrative of the history of computing. Ref. Kleiman, K. (2022). Proving Ground: The Untold Story of the Six Women Who Programmed the World’s First Modern Computer. Grand Central Publishing. ↩︎
- Bro culture is a cultural model prevalent in the technology industry, characterised by environments dominated by men, often white and Western, who value competition, aggression, ostentatious confidence and individual success, producing workplaces that are rarely inclusive and that marginalise different perspectives. ↩︎
- The concept of self-design refers to the process through which individuals and groups design their own identity within technological, social, and economic systems. From a critical perspective, it raises questions about autonomy, control, commodification of the self, and ethical responsibility. ↩︎
- Freeports are high-security storage areas where valuable goods can be stored or traded without taxation, often associated with opacity and speculation in the art market. ↩︎
- In social theory, agency refers to the capacity of individuals to act independently, to make choices, and to influence social structures through their actions. Ref. Giddens, A. (1984). The Constitution of Society: Outline of the Theory of Structuration. Cambridge: Polity Press. ↩︎
- The ELIZA effect describes the human tendency to attribute intentionality, emotions or consciousness to technological systems that actually operate through automated responses. Rob Horning uses this concept to criticise the illusions projected onto artificial intelligence and conversational technologies. ↩︎
- Sherry Turkle is an American sociologist and psychologist who has studied the impact of digital technologies on identity and relationships. Her work analyses how hyper-connection can produce new forms of isolation and emotional fragility. ↩︎
- These terms refer to alternative or less visible digital infrastructures: darknets are networks that can only be accessed using specific software, often associated with privacy and anonymity; P2P (peer-to-peer) networks are decentralised networks in which users share resources directly; mesh grids are distributed networks without a centre, often used in contexts of technological autonomy. ↩︎
- Hito Steyerl is an artist and theorist who has analysed digital visual culture. By “mean images”, she refers to the tendency of algorithms to produce standardised, statistical representations that flatten the complexity of human experience. ↩︎