
Digital Contagions

A Media Archaeology of Computer Viruses, Second Edition


Jussi Parikka

Now in its second edition, Digital Contagions is the first book to offer a comprehensive and critical analysis of the culture and history of the computer virus.
At a time when our networks arguably feel more insecure than ever, the book provides an overview of how our fears about networks are part of a more complex story of the development of digital culture. It writes a media archaeology of computer and network accidents that are endemic to the computational media ecology. Viruses, worms, and other software objects are not seen merely from the perspective of anti-virus research or practical security concerns, but as cultural and historical expressions that traverse a non-linear field from fiction to technical media, from net art to politics of software.
Mapping the anomalies of network culture from the angles of security concerns, the biopolitics of computer systems, and the aspirations for artificial life in software, this second edition also pays attention to the emergence of recent issues of cybersecurity and new forms of digital insecurity. A new preface by Sean Cubitt is also provided.

Introduction: The General Accident of Digital Network Culture


As usual, everything negative remains untold, yet it is, interestingly enough, always there, in an embryonic form. How is it possible to state that technologies are being developed, without any attempt being made at learning about the very specific accidents that go with them?1

—Paul Virilio (1999)

Any information system of sufficient complexity will inevitably become infected with viruses: viruses generated from within itself.2

—Neal Stephenson, Snow Crash (1992)

Disease and Technology

The history of media and technology is a history of accidents. Things break down almost as often as they actually do what they are supposed to do. We fantasize about machines of ultimate communication and functionality, but the reality is different and goes back to a variety of technologies of movement, communication, and transmission. The train introduced the train accident, with the boat came the boating accident, and inherent in several techniques of data storage, such as papyrus, paper, and film, is the ever-present possibility of the erasure of information.3 Media are always embodied, and this embodiment is always related to rather simple physics: things decay and rot, and any communication medium is as vulnerable to the passing of time as to the speed that might bring about a crash. One could, indeed, write a whole shadow history of technology through how it does not work: how it breaks down, how it frustrates and messes things up, how it disappoints and fails to meet expectations that are always rather sublime compared to the mundane everyday. Already the telegraph, the early network technology of the nineteenth century, was vulnerable to such problems, as perceived in 1846 by Dr. Barbay, a semaphore enthusiast:

No, the electric telegraph is not a sound invention. It will always be at the mercy of the slightest disruption, wild youths, drunkards, bums, etc. (…) The electric telegraph meets those destructive elements with only a few meters of wire over which supervision is impossible. A single man could, without being seen, cut the telegraph wires leading to Paris, and in twenty-four hours cut in ten different places the wires of the same line, without being arrested.4

As technological accidents have a history, so do diseases. The causes of disease, since the nineteenth century mainly identified as harmful minuscule actors, infectious bacteria and viruses, affect the very basics of human societies. They accompany the so-called human world, as do the animals we choose to live with. And diseases are themselves conditioned by a variety of social mechanisms: the spatio-temporal occurrence of a disease, an outbreak, is always set in a field of definitions of causes, of sanitation measures, of security, and of other ways to control the territory. Diseases tell a story of society. Diseases are symptomatic of the ways cultures interact. They reveal paths of communication and commerce, of interaction and cultural hierarchies, which form the networks of a society: what affects what, who frequents whom and where, and so forth. Diseases expose.

The improvement of road networks from late antiquity onward exposed populations to a growing number of infectious agents. The Silk Road was such a network of goods as well as germs, from China to Europe and back. However, a major change occurred with urbanization, industrialization, and the new transmission channels of steamships, railroads, and airplanes, which connected people and germs more efficiently than ever. The Black Death that infected Europe in the mid-fourteenth century was to a large extent an expression of the new paths of communication and transmission paved by the Mongol empire, as well as of the new ship routes connecting the Mediterranean to the northern parts of Europe. In the nineteenth century, steam in the form of locomotives and transoceanic ships helped microorganisms to create a single, near-global disease pool.5 Transmission and communication have always defined and transformed geopolitical boundaries, and any consideration of the global should be aware of the role germs play; geopolitics can be read through diseases and their paths. Even in modern times, ships remain the clandestine transport vehicles of diseases—whether the plague of psychoanalysis, as Freud quipped to Jung when entering New York harbor, or the mythical figure of Count Dracula in Bram Stoker’s novel of 1897.6

Transmission and communication come with a cost—an anxiety about established borders becoming mixed up and trespassed. At the end of the twentieth century, another form of technological accident caused an immense amount of public anxiety. It seemed to combine these lineages of technological accidents and disease patterns and to produce a truly novel situation. Computer viruses and worms, as the latest occurrences in the history of diseases, caused something that was even referred to as “hysteria”—the key sexed disease of the previous fin de siècle. Viruses have become a sign of the fin de millennium, indications of the uncontrolled threats of late modern technological mass society alongside a range of other threats, such as Chernobyl, AIDS, and Ebola. Digital viruses threaten to turn digital memory banks into “terminal gibberish.”7 As Wolfgang Schivelbusch notes, before industrialization, accidents seemed to refer to “coincidences” that attacked from the outside, usually in the form of natural catastrophes, such as storms and floods. With industrial machines, the accident became an internal problem, as technologies seemed potentially to turn on themselves as a result of their own power.8 The recent decades of global technological infrastructures have seen a range of technological accidents, perceived as risks to be minimized and managed, signified, and valorized in the (mass) media system (in the traditional sense of print media, television, and, for example, cinema). The utopia of the global village9 and the vast number of accounts celebrating networks and cyberspaces as the key utopian (non)places of recent decades showed their flipside to be a dystopia of global risks and accidents.

Digital Contagions is a story of accidents but also an analysis of the historical conditions that made it possible to conceive and program such an entity as a “computer virus” or a “digital worm.” It is an attempt to write a media archaeology of computer and network accidents. As such, it asks: What are the material and incorporeal conditions of existence for such oxymoronic entities of the network age? Why does it make sense to talk of computer viruses, as if the computer and the network were alive and vulnerable to parasites—a seeming force of nature, yet programmed by humans? Digital Contagions complexifies the technological definition and understanding of computer viruses through historical source work as well as theoretical arguments. It participates in the software studies discussions of algorithmic culture of recent years, but by way of a media archaeology of accidents.10 Throughout the book there is a very strong tendency to historicize, to argue through historical sources, and to use history (in the sense of archaeology) as a method of situated knowledge: to pinpoint strategies of knowledge as historical and contingent and to analyze within a meshwork of perspectival positions that resist “categorical a priori condition[s] of knowledge.”11

Computer viruses are read as part of global digital culture, although through a specific source base that is mostly Anglo-American, consisting of popular cultural discourse, computer texts aimed at the general public, computer science texts, and security discourse. Instead of offering an insight into only the most recent events, I want to look at the early emergence of the issue, in the midst of the era when computers started to become more widespread and the discussion concerning networks was emerging, that is, mostly the 1970s through to the 1990s. As I argue in this book, the crucial issues, discussions, and positions concerning viruses and worms were already visible before the mid-1990s, which is why I want to underscore these earlier threads and lineages in this discourse network.12 This means that I focus primarily on issues before the rise of the popular Internet in the mid-1990s, even if, as I argue below, it would be difficult to ignore the recent National Security Agency (NSA) revelations by Edward Snowden or the massive preparations by various national defense departments and industries for cyberwar.

Figure 1. During the past twenty years, various software and art projects have addressed computer viruses as a curious aesthetic and media cultural phenomenon. One such example was the net art virus from 2001, which was displayed on T-shirts and sold on CD-ROMs ($1,500 each). With the virus, computer code became a media performance. (Image reproduced with permission. Credits: Eva and Franco Mattes, 2001, Computer virus.)

Digital Contagions analyzes the media archaeology of this specific computer accident as a symptom of a more abstract cultural diagram. Computer viruses, worms, and other related software are not, then, antithetical to digital communications but, as the book argues, at the very center of network culture. Neither is the digital virus solely an internal computer problem. It is a symptomatic part of security cultures before and after 9/11, of the emergence of digital culture, of the worries over new social dilemmas, and of networking as the key reference point in discussions of computer science. The virus is also an expression of the media ecology and the so-called biological diagram of the computer, where the biological sciences are actively interfaced with computer science, often with a special emphasis on bottom-up emergence.13

Methodologically, I am following Michel Foucault’s ideas of (cultural) archaeology and genealogy.14 I proceed in the same manner when understanding the primary question to be one of cultural mechanisms, or forces, that give birth to the myriad ways in which we program, discuss, debate, control, insulate, and police software as well. The question is not so much what a computer virus is, but how it came to be what it is. This historical questioning requires that we focus on the conditions of a thing, not its essence.15 In other words, what made computer viruses and the like part of a computer problem, what gave rise to malicious software, and what other sorts of things does this general determination hide? What made computer viruses intelligible—in discourse and in practice—is a rather fundamental question that one could ask of many other formations of software too, especially the ones we take for granted. Hence, media archaeological excavation is committed to seeking the conditions of existence of cultural phenomena and the interplay between continuities and discontinuities. The theoretical underpinnings are always developed in a perspective where temporality is seen as a polymorphous and overlapping layering, a coexistence of multiple temporal fields of longer and shorter durations.16

There are various stages to the emergence of self-reproducing programs, a theme that predates the issue of computer viruses. In the same manner as viruses can be said to be a possibility enabled by the computer architecture developed by John von Neumann,17 they are a part of the network culture of the late twentieth century, sprouting during the early years of computing in the 1940s, taking a more familiar shape from the 1960s and 1970s onward as networking became a key paradigm for computing, and bursting into bloom during the 1980s and early 1990s with personal computers, the Internet, and the consumerization of digital culture. My main temporal emphasis is on the period between the von Neumann machines conceived around the end of the 1940s and the start of the 1950s and the early years of the popular Internet in the 1990s. The story of networks is one of the Cold War, and similarly the story of software is one of changing security emphases, techniques, and containments. Software is a continuation of war by other means—but not merely war: it is also a story of software as part of the emergence of a corporate culture based on digital communications, of the securitization of business transactions as well as the private sphere, and of the implementation of new ways of thinking about the computer as a hub of activity that sometimes even fed into discussions of artificial life.

I insist on talking about viruses and their kin as a form of a general accident of digital culture. But surely they are not merely accidents, and not even accidental, because they are programmed by people with specific intentions in mind and goals they want to reach? Indeed, there is more to the term “accident.” An accident, as Paul Virilio writes, is not in this context intended to mean the opposite of absolute and necessary, as Aristotelian metaphysics has maintained. Instead, accidents are an inherent part of an entity. Accidents are internal to technologies: “Since the production of any ‘substance’ is simultaneously the production of a typical accident, breakdown or failure is less the deregulation of production than the production of a specific failure, or even a partial or total destruction.”18 For Virilio, the generality of the contemporary accident stems from its specific temporality. Key contemporary accidents, whether the 9/11 attacks on U.S. soil, the 1987 stock exchange crash, the Gulf wars since the 1990s, or the financial meltdown since 2008, are events that touch the whole globe, even if with differing effects. Such incidents of simultaneity and ubiquity are illustrative of the cybernetic networks of global scale that take over the spatial world of human perception.19 Computer accidents share some similar features. Even if computer errors such as viruses can be argued to be just a specific form of technological accident, they reveal a whole bundle of issues having to do with security, accidents, and risk management embedded in issues of globalization, digitalization, and media culture. Yet “general” does not mean “universal” in the sense that I would claim viruses and worms to be the only accidents of network technologies. Instead, faulty code, bugs, and spam show that the viral is merely one situated perspective on the issue. Bots and zombie machines are a further security threat that extends the discussion from viruses to other manners of losing control in networks. Perhaps the control was never really there anyway. As Finn Brunton demonstrates in his excellent book on spam,20 bots are forms of software that retain a relation to the human network of spammers and programmers, even if they automate much of the labor involved. As any frequent user of the Internet knows, it is not only for humans. The massive traffic of packets is one thing—one you are mostly unaware of; or, as Wendy Chun reminds us,21 we just do not know exactly what our computers are doing at a specific time, despite knowing some of the key theoretical points about them or even understanding programming. But another thing is the various forms of nonhuman agents that we sometimes treat as if they were human, or that sometimes simply steal that role. Bots do things for us, from floodbots that presented a form of excessive communication which would have made the founding figures of Dada proud, to the chat bots and zombie networks that redefine the borders of inside and outside in computer culture. With the Internet of Things, a vast range of objects becomes vulnerable to such a loss of control, from the Jeep on the road to the household appliance22 to more geopolitically significant targets, such as gas pumps.23

Following Brunton, one could even say that issues such as bots relate to the bigger issue of how “what is available to our robots (computers, sensors, algorithms) is distinct from what’s available to us, with our human brains.”24 Software triggers a plethora of issues, some of them rather fundamental, standing ontologically as well as operationally at the center of digital culture: What is perceivable? What is operational? What is automatable?

As the epigraph to this introduction from the fiction writer Neal Stephenson underlines, accidents can be understood as emergent phenomena on the level of socio-technological processes. Accidents are not external but hypertrophical to the normalized functioning of a technological machine. Accidents do not merely break the rules of a system; they actually use and bend them to their limit, exposing them along the way. Hence, it is not merely the question of malicious software25 that is of interest here, but the wider sense in which patterns of virality, contagion, and containment are in operation in digital culture and articulated in relation to the security of economic flows, to political issues, and to the representational as well as nonrepresentational materiality of digitality.

Cultural techniques are increasingly digital techniques, whether we are referring to labor done in offices, the huge growth of digital entertainment services, or the infrastructure necessary to enable computer networking. Manuel Castells has divided the rise of the digital network society into several distinct steps. For Castells, the key components within this history of digital culture were new technical paradigms such as ARPANET (1969); integrated circuits (1971); personal computers (1974–1976); the software revolution, UNIX (1974); and the TCP/IP protocols (1973–1978), which made the Internet technologically possible.26 It is a short list of technological milestones, but it provides an understanding of the material discourse networks with which cultural reality is intermingled. The material understanding of culture emphasizes that cultural meanings, articulations, perceptions, metaphors, statements, and discourses are not free-floating ideal signs and ideas; they are very deeply rooted in the information channels that make them possible. Meanings and significations always happen in relation to material situations or discourse networks, which connect people, institutions, devices, and so on in a discursive and nondiscursive sphere.

The material characteristics remained immanent to incorporeal and abstract events on the socio-cultural and economic scales. The major crises of capitalism (1973–1975) and Soviet-led statism (1975–1980) coincided with the new technologies of digitality, a trend especially visible in information capitalism, which profited from the deregulation, liberalization, privatization, and globalization of this ascending world order, or “Empire,” as Michael Hardt and Antonio Negri named it.27 In Empire they argue that this turn toward information capitalism took place in the 1970s, at the end of the Vietnam War, after which transnational companies spread across the globe, and digital flexibility and information networks expressed the material basis of this new regime. In other words, globally infrastructured networks became the key vectors of power, wealth, and control, material discourse networks supported by multinational computer corporations. This infrastructuring spanned from hardware to operating systems and software. As they famously phrased it, “The network itself is the site of both production and circulation,”28 hinting at the stakes in securing the uninterrupted working of cross-national communications.

But it does not always take the work of cultural theory to recognize this. Sometimes these stories are told out in the open and found in the archives. As Scientific American wrote in 1986, in the midst of the emerging digital boom and the coming age of the Internet:

This is an age of proliferating connections: among people, among machines, among silicon chips within a machine, among transistors on a silicon chip. Information is processed by networks of chips and communicated through much vaster networks of copper cables and glass fibers by electrons and photons. There are alternatives to such networks: people can travel to meet one another face to face, and they can process data themselves rather than having computers do the job for them. But in most cases these are poor alternatives compared with what it is now possible to accomplish through electronic and photonic connections. As a result functions once carried out by people are increasingly being carried out by systems of electronic and photonic materials.29

It is against the backdrop of such valorizations and evaluations that software, viruses, and worms emerged: as disruptions of the new era of the networked silicon chip, as an accident of information that was often defined in a rather confusing way, as a fear of a technological accident that could wipe out the massive amounts of data necessary to sustain normal everyday life. The Millennium Bug scare was one feature in this story; uncontrollable, sometimes malicious code was another. The virus was a rupture in the digital dreams of communication, and a rupture in the symbolic framework of the everyday that could tear apart that e-mail, that storage device, that privacy of the mobile device with the multiple levels of personal secrets and corporate communications you carry with you.30


Digital Contagions consists of three main sections: (I) fear and security, (II) body, and (III) artificial life (ALife). Section I, “Fear Secured: From Bugs to Worms,” engages with computer security discourses and practices, especially since the 1970s. Computer viruses appeared in the science fiction novels of the 1970s—for example, David Gerrold’s When HARLIE Was One (1972) and John Brunner’s The Shockwave Rider (1975)—but it was in 1983 that Fred Cohen, a computer scientist, engaged in experiments that marked the birth of more sustained research into the computer science of viruses. Cohen understood the potentially malicious nature of these miniprograms, depicting them as forms of risk and danger, given the increasing reliance of institutions on computer infrastructure in the form of local area networks, plans for larger networks, software production, and so forth. The first chapter analyzes the technological, political, and economic tendencies in computer security, revealing how deeply the definitions of computer worms and viruses are embedded in issues of media, risk society, and (viral) capitalism.

Around 1985–1986, computer viruses reached the public debate for the first time and were from the outset described as “malicious miniprograms” and as threats to national security, international commerce, and the private user. The Pakistani Brain virus of 1986, Jerusalem of 1988, and Datacrime of 1989 were discussed widely in newspapers, but the 1988 Morris worm, or the “Internet worm,” as it has also been called, truly alarmed the computing community and parts of the general public, especially in the U.S. It became a topic of TV news and newspaper headlines, and it was discussed for years. Viruses and worms were no longer just self-spreading programs with which to test the coming age of network society; they were also loaded with a plethora of meanings, connotations, and articulations drawn from discourses of disease and disease control, crime, and international politics. Hence, what had been only the imagination of some science fiction writers in the mid-1970s gathered a new intensity some ten years later.31 But fiction was much more than just a premediating shadow; it already articulated alternative insights into viral culture, acting as a sort of speculative design fiction of a computerized future world of nonhuman agents.32

In Section II, “Body: Biopolitics of Digital Systems,” I give special attention to virus culture since the 1980s and to how a more generic sense of embodiment was mobilized as part of an understanding of security, insides, and outsides. “Viruses” and “virality” became central cultural figures in the sense that tuberculosis had been in the nineteenth century and cancer was in the 1970s.33 Popular media culture became filled with parasites and viruses from the 1980s onward, pointing toward a new understanding of the openness of the body to its surroundings. The body became a possible vector of diseases that was also articulated together with information circulation. The virus and virality marked a state of liminality where the established borders of identity—whether biological or computational—became leaky. The human immunodeficiency virus (HIV) was at the center of numerous contested articulations concerning the actions, sexualities, gender, and ethnicities of human bodies; the computer virus (or the “computer AIDS,” as it was often referred to) inhabited several similar fields of struggle, on which contests over “proper use of computing,” “safe hex,”34 “digital hygiene,” and other key enunciations of what the software might operationally mean took place. Consequently, the second section addresses the discourses and practices of immunology, disease, healthy computing, and digital hygienics, particularly as they were expressed in the 1980s. Just as a judge pronouncing the word “guilty” transforms a person into a criminal, the various “pronouncements” of the heavily loaded term “virus,” with all its associations, turn technological objects into malicious software, effecting incorporeal transformations.35 This section, then, adds to the more technological and corporeal themes mapped in the first section, using as its key concept the incorporeal transformations of a material body (of software).

Section III, “Life: Viral Ecologies,” expands the biological emphasis in order to ask questions about the media ecology of self-reproducing life in networks. Here I argue that technological evolution and the notion of self-spreading computer programs have much more widespread roots in the cultural history of modernity than the security-oriented notion of a computer virus implies. Von Neumann’s theories of cellular automata from the 1940s, Norbert Wiener’s ideas of self-reproducing machines from the same age, the whole agenda of the cybernetic automation of culture in the postwar era, the sciences of artificial life since the 1980s, and even the 1872 novel Erewhon by Samuel Butler all testify to this widespread cultural theme of technological life. In addition, by devoting a whole section of the book to artificial life, I want to emphasize the multiple contexts of computer viruses. Even though a major portion of the public discourse on computer worms and viruses has underlined the negative sides of these “malicious miniprograms” as products of criminals, vandals, and simply badly informed youth, they actually have much more to say about technological culture. Hence, the key concepts and themes of the third section revolve around ecologies of networking, systems of complexity, and life as emergent relations and connections.

At the beginning of the 1990s, if not already earlier, the question of “beneficial computer viruses” emerged. The very same Fred Cohen who had in 1983 depicted viruses as potential threats to organized society now reminded his readers of the positive possibilities of these miniprograms. Analyzing viruses in the contexts of nanotechnology, artificial life, and evolutionary software, he saw computer viruses as more than just malicious and nasty pieces of program code.36 In a similar vein, Mark Ludwig sparked a whole range of discussion and critique with his books on viruses and artificial life. He even included instructions for coding such programs.37

It is no wonder that ideas about a “third nature” emerged around the same period. As McKenzie Wark proposed, perhaps the digital culture of the late twentieth century is to be understood as nature, a third nature that supplements the two previous ones. As industrialization changed the biological and chemical first nature into a product and a standing reserve for the accumulation of wealth in a society, so the third nature is a continuation of this process of translating nature into the terms of information technology. “Second nature” was conceptualized by Hegel, Marx, and Lukács; Wark’s own analysis of the third nature of simulated digital culture offers a further vector for understanding the layered ecologies. The telegraph began the process of bypassing the spatio-temporal coordinates of locality and sowing the seeds for a global network of virtual spaces, described metaphorically during the 1980s as “cyberspace” by several enthusiastic writers. With this new arena and nonplace of speed and telesthesia (“perception at a distance”), the flow of information became the new earth, an artificial nature, a media sphere: “Second nature, which appears to us as the geography of cities and roads and harbours and wool stores is progressively overlaid with a third nature of information flows, creating an information landscape which almost entirely covers the old territories.”38 With the discovery of electromagnetism and the electric telegraph, messages were detached from the bodies of messengers and seemed to inhabit a life of their own. Globalization was tied to this vector of information, itself tied to transport routes and infrastructures. Goods, money, and people circulate, but so do garbage, crime, and disease.39 This applies to the information nature of the third order as well: as AIDS has revealed the global patterns of disease in contemporary societies, so computer worms and viruses make visible the global tendencies of the electrical nature that originated in the nineteenth century with the beginning of telecommunications, specifically the telegraph. The notion of “third nature” implies a multiplication of such hybrid objects in the spheres of telecommunications.40 Viruses and worms are not only an issue of security but also of the wider “turbulent space” of the Internet, “a space suitable to the spread of contagion and transversal propagation of movement (from computer viruses to ideas and affects),”41 as Tiziana Terranova puts it.

Media Theory Meets Computer Technology: Definitions, Concepts, and Sources

On a technical layer, a digital virus is designed to attach a copy of itself to a host program. These hosts have often been executable files (.exe or .com) but can also be data files, macro scripts, and, for example, the boot sector of a hard drive. The program is intimately tied to the infrastructures of storage and transmission of computer culture—something that has changed quite significantly over the decades. From the circulation of floppy disks to hard drives to the ubiquitous wireless culture, we are dealing with alternative vectors of contagion that do not equal “touching” in the usual sense. Traditionally, computer viruses have also included a “trigger” and a “payload.” This means, for example, that a virus in the age of discs might trigger after, say, 50 boots and then release its payload. These payloads varied: some old, classical viruses play a song, others format your hard disk, and some do nothing out of the ordinary. Some famous viruses have made letters fall off the screen one by one, imitated the “Yankee Doodle” tune, and printed insults. Viruses can be seen as a special form of the IF/THEN routine pair. The infection mechanism looks for infectable objects, and IF it finds them, THEN it infects them. The trigger can be set for a specific date, and IF it is reached, THEN the trigger is pulled.42
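
To make this IF/THEN structure concrete, the following is a minimal and deliberately harmless sketch in Python. Everything is simulated in memory with plain dictionaries, nothing touches an actual file system, and all of the names (infect, payload, boot, BOOT_TRIGGER) are illustrative assumptions of mine rather than code from any actual virus.

```python
# A deliberately harmless, in-memory illustration of the three routines named
# above: an infection mechanism, a trigger, and a payload. "Files" here are
# plain dictionaries; no real files or systems are touched.

BOOT_TRIGGER = 50  # the text's example: release the payload after, say, 50 boots

def infect(files):
    """Infection mechanism: IF an uninfected object is found, THEN mark it."""
    for f in files:
        if not f["infected"]:      # IF an infectable host is found...
            f["infected"] = True   # ...THEN "attach a copy" (here, just a flag)

def payload():
    """A benign stand-in for the classic payloads (tunes, falling letters)."""
    print("Imagine 'Yankee Doodle' playing here.")

def boot(state, files):
    """One simulated boot: infect first, then test the trigger condition."""
    state["boots"] += 1
    infect(files)
    if state["boots"] >= BOOT_TRIGGER:  # IF the trigger condition is met...
        payload()                       # ...THEN the trigger is pulled

if __name__ == "__main__":
    machine = {"boots": 0}
    disk = [{"name": "GAME.COM", "infected": False},
            {"name": "EDIT.EXE", "infected": False}]
    for _ in range(BOOT_TRIGGER):
        boot(machine, disk)
```

The point of the sketch is structural rather than operational: the virus form is little more than a conditional pair wrapped around a reproduction routine, which is why the same form can carry a tune, an insult, or a formatting routine as its payload.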

A virus attaches itself to other programs, whereas computer worms are more self-contained and do not need to be part of another program to propagate. Worms became a more central phenomenon in later years with the network techniques of the World Wide Web, e-mail, and file sharing (although worm programs and such techniques date from the 1970s). Basically, and technically, viruses and worms are two different types of programs.43 Often they are referred to by the generic terms “malware” or “malicious software.” Malicious software, of course, includes a wide variety of different programs, such as Trojan horses in the form of spyware, botnets, loggers, and dialers. Several such programs are designed for commercial purposes. What is curious is that viruses spread best across homogeneous platforms with weak security features, which historically led to a lot of accusations concerning the Microsoft Windows operating system. The corporate image of the company probably enticed virus writers to target Windows, but the socio-technical characteristics are worth noting. The Windows operating system became notorious for its security flaws, which seemed to pile up recurrently as this specific operating system achieved massive penetration across the world. In addition, compared with open source projects, which more efficiently employ user input in improving design flaws, the Microsoft model seemed rigid and inefficient.
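
The distinction can be sketched in the same harmless, in-memory style: where the virus above needed host objects to mark, a worm carries itself whole and needs only reachable machines. Again, the toy network and all names (worm_step, has_worm) are illustrative assumptions, not working malware.

```python
# An equally harmless sketch of worm-style propagation: unlike the virus,
# the worm is self-contained and needs no host program, only connectivity.
# The "network" is an in-memory graph of dictionaries.

def worm_step(network, current):
    """Copy the (simulated) worm to every reachable, uninfected machine."""
    for neighbor in network[current]["links"]:
        node = network[neighbor]
        if not node["has_worm"]:
            node["has_worm"] = True       # the worm arrives whole...
            worm_step(network, neighbor)  # ...and keeps propagating on its own

if __name__ == "__main__":
    net = {
        "A": {"links": ["B", "C"], "has_worm": True},   # patient zero
        "B": {"links": ["C"], "has_worm": False},
        "C": {"links": ["A"], "has_worm": False},
    }
    worm_step(net, "A")
    print({name: node["has_worm"] for name, node in net.items()})
```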

It is important to note how fundamental these issues of definition are concerning the life of software. If the emphasis is placed on reproduction routines, virus- and worm-like programs cannot be said to be malicious by definition. It is often taken for granted that computer viruses are malicious by nature and that their payload is always harmful. Yet, as several writers such as Fred Cohen have argued, virus-like programs can contain various types of payloads and perform several different routines in addition to their fundamental nature as semiautonomous and self-reproductive software.44

Digital Contagions can be characterized as cultural analysis that works with historical material and sources, which are introduced separately at the beginning of the bibliography.45 Thinking is thinking with something, and for a media archaeologist, historical sources are material to think with.46 Thus, as the post-structuralist cultural theories of recent decades have instructed us, theories and empirical sources are deeply intertwined, and it makes no sense to return to questions that separate thinking from the material world in which it is embedded. The empirical sources on viruses, namely discourses and practices, already include implicit premises and theories of what viruses are.47 It is a matter neither of giving form to a corpus of sources nor of submitting oneself to the historical “facts,” but of following the tendencies inherent in the sources. The methodological task is not merely to look for popular representations of viruses or to seek out the scientific truth of these technological products, but to look at the frames themselves—where, why, and how these truths are produced: What are the technological, cultural, and historical frameworks, or discourse networks, in which such positions function and interact? The function of media archaeology is to look at the immanent strategies producing reality.48

There is an ongoing tension, then, between underlining the emergent features of the phenomenon at hand (well illustrated in the epigraph by Stephenson) and analyzing the actors contributing to the construction of the thematics of viruses and “malicious” software. Basically, this is an irresolvable tension, as I want to demonstrate that although different actors and actor-networks have been actively trying to pin down the meanings and uses of viruses, worms, and similar programs to serve their own ends, the phenomenon cannot be reduced to being merely a part of such strategies of signification and valorization. In other words, there is a constant danger of a homeostatic biologism that would remain on an exclusively metaphorical level of analysis.49

A large proportion of the analysis in this book concerns U.S. society and its particular contexts of massive computerization since World War II, the atmosphere of Cold War fear and paranoia, and the context of global consumer capitalism since the 1970s. However, my perspective is not on national technological cultures but on cultures of global technology. The media technologies of modernization—from the telegraph to the digital networks of contemporary society—have engaged us in a deterritorialization of local cultures, a disembedding and a move toward what could be called a “culture of global flows.”50 As argued above, the digital culture of telecommunications is also a world of the third nature, which is not restricted to the spatio-temporal grids of human experience. It is post-phenomenological, untied to the world of human perception even if it might govern it. Global cultural practices thus demand a global perspective for cultural analysis, which does not, however, mean universalization and decontextualization: it demands that we rethink what we mean by situated knowledge in media ecologies that are transnational, software-based, and consist of multiple levels of agency that are not merely human.

The idea of “media ecology” has, of course, been used in various ways. Traditionally it emerged from the work of Marshall McLuhan, Lewis Mumford, Harold Innis, Walter Ong, and Jacques Ellul, even if the conservative work of Neil Postman has become a key inheritor of the term. As Michael Goddard describes Postman: “In both Amusing Ourselves to Death (1987) and the more recent Technopoly (1993), Postman adopts a form of populist technophobia that only seems to maintain from McLuhan his anecdotal style and love of metaphor and whose only antidote to the Behemoth of technological domination seems to be a quite conservative notion of pedagogy.”51 But media ecology cannot be reduced to Postman, and even the Toronto-based scholars around McLuhan had a richer notion of media. The radical advantage of their take was a strongly historical notion of media in which perception became imbued in its material realm. With the Toronto school of media ecology, aesthetics came to be about not the object perceived but the spatio-temporal conditions, technological framing, and media distribution of ways of perception. The later European radical media ecology stems from the work of Matthew Fuller, which imports a range of alternative scholars into the discussion, including Kittler, Guattari, and N. Katherine Hayles. For Fuller, using the term “ecology” to describe media phenomena is justified as it indicates “the massive and dynamic interrelation of processes and objects, beings and things, patterns and matter.”52 As Goddard emphasizes, Fuller’s way of handling media ecology pluralizes it and teases out its Guattarian potential to see it as a way to tap into the various ethico-aesthetic and politically significant experiments in media technological culture. It is a way to situate processes that are technologically abstract but concretely part of the political determination of our situation and subjectivity.

Media ecologies consist of concrete machinic assemblages and abstract machines. Technology is in such a perspective understood as an “assemblage” (agencement in French) of heterogeneous parts, consisting of organic and nonorganic, significatory as well as a-significatory entities, with differing durations and overlapping histories. Assemblages can perhaps be understood as concrete stabilizations on a machinic phylum of media ecology:

We will call an assemblage every constellation of singularities and traits deduced from the flow—selected, organized, stratified—in such a way as to converge (…) artificially or naturally. (…) Assemblages may group themselves into extremely vast constellations constituting “cultures,” or even ages. (…) We may distinguish in every case a number of very different lines. Some of them, phylogenetic lines, travel long distances between assemblages of various ages and culture (from the blowgun to the cannon? From the prayer wheel to the propeller? From the pot to the motor?); others, ontogenetic lines, are internal to one assemblage and link up its various elements, or else cause something to pass (…) into another assemblage of different nature but of the same culture or age (for example, the horseshoe which spread through agricultural assemblages).53

This view on technology asserts that technology is “machinic” and based on flows (although a machinic assemblage is never only technological).54 Machinic assemblages help us to think of the world as intertwined; the “things” connected (the linguistic concept of “virus” articulated to self-reproductive software programs, for example) do not precede the connections.55 And not all machines are concrete or resemble what we think of as technological machines. In this Deleuzian and Guattarian sense, some machines are more abstract, yet as real as the concrete, even technological ones. Connections are made across scales: biological terms are transported into technology, politics intertwines with aesthetics, diseases spread across economics.

The phenomenon of computer worms and viruses is formed of various machinations, concrete and abstract, that consist of technological, social, psychological, and economic parts. In short, it is irreducible to one determining factor or territory; it moves across the corporeal and the incorporeal, the material and its various temporal articulations. The material, or corporeal, aspect has to be complemented by the incorporeal as conceptualized by Deleuze and by the new materialist tradition of thought, which insists that matter matters, that it is alive with life, and that it is irreducible to the hylomorphic scheme of things separated from their meanings. The material is immanently pierced by the incorporeal and the discursive instead of being bifurcated into mind and matter. Even though the corporeal reality of a computer virus might stay the same, because it is a certain pattern of code that replicates, infects, and spreads, it can be understood very differently. To use Claire Colebrook’s example: “A knife may cut a body, but when we call this event punishment we create a whole new (incorporeal) world of morals, crimes, criminals, laws and judgments.”56 This type of conception of language emphasizes pragmatics over semantics, underlining the order-word nature of acts of language. Language is not merely communication or about thinking but a force that materially connects rhizomatically with its outside.57 Material processes have their own duration that is not reducible to signification, but at the same time acts of order-words impose actual transformations in terms of categories, definitions, and events. Deleuze and Guattari refer to the incorporeal transformation of an airplane (the plane-body) into a prison-body in a hijacking situation, where the transformation is enacted by the “mass media act” as an order-word.58 By way of different transformations, a computer virus has been turned, in various assemblages of enunciation (such as mass media acts), into malicious software and a security problem, but also into a piece of net art, an artificial-life project, or a potentially beneficial utility program. This logic also organizes my sections and chapters into the three themes of “security,” “body,” and “life,” which all might sound general, vague, and abstract but actually pinpoint specific histories under those rubrics.


In reference to my title, Digital Contagions, “contagions” has a double role here: to present the contagions induced by computer worms and viruses, and to produce new types of contagions of thought within cultural theory and media history. Contagions are thus to be understood in the sense Michel Serres gives to parasites: parasites are not actually disconnections of communication but excluded thirds that guarantee the interconnectivity and movement of a system. Communication cannot go on without the element of the excluded-third mediator, of contagion, of even miscommunication at its center: communication consists of both signals and noise, just as the classical communication theory of Shannon and Weaver taught.59 Never simply dismiss the proverbial or literal noise. The task is to engage the interference, not just as an engineering element to be reduced but as a key trait within the network societies of digital culture, a trace to be followed, a tendency to be thought with. Contagion is a jump cut, an open-ended system, an experiment.60 What has to be noted early on, however, is how the theories and theoreticians used here are part of the same field as the phenomenon of computer viruses and the culture of virality (comprising, for the most part, continental theories of post-structuralism and radical difference from the 1960s to the 1990s). Just as various ideas in computer science, cybernetics, and, for example, ecological thought were wired into the philosophies of Lacan, Derrida, Foucault, and Deleuze and Guattari, so their ideas concerning systems, aesthetics, and, for example, politics have had an impact on the digital (counter)culture in Europe and the United States since the 1980s.61 Instead of being a problem, this has further enticed me to think with Deleuze and Guattari—their writings function as a form of nonlinear ecological system, relying on forces of variation that can act as triggers of becomings, of new modes of thought and action. Theories are not used as direct applications of an explanatory grid to a material but as ways of cultivation. They are different, more conceptual ways of articulating and mapping the contours of the viral ecology of late twentieth-century computer culture.

My general framework is to move in the patterns and vectors outlined by Kittlerian material analysis but with the Deleuzian twist of embracing incorporealities and multiplicities. My theoretical references in this work should primarily be read in relation to this bipolar theoretical phasing between Kittler and Deleuze. Whereas Kittler brings in the technologization of Foucault, and hence a necessary perspective on how to analyze the material genealogies of technical media, Deleuze is important as a thinker of becomings. Their wedding in this work is obviously not the most frictionless one, but it can produce important ways to bring some Deleuzian movement into Kittler’s way of analyzing discourse networks, which might otherwise tend to become too rigid. Whereas Deleuze (with Guattari) is important in creating the ethico-aesthetic agenda of the work, a sort of minoritarian cultural memory, if I may, Kittler is suitable for a historically oriented analysis and synthesis of discourse networks. Hence, the discourse network acts as a media ecology and a machinic phylum, consisting of incorporeal acts of ordering and corporeal movements of, say, hardware, software, networks, people, and architectures. Kittler underscores that media in complex ways determine our situation and make possible what can be stated and thought in a certain age; in contrast, Deleuze, with and without Guattari, tends to think in terms of movements and assemblages. Technologies are always part of social assemblages that select and employ them for particular uses and functions. Media technologies can be thought to occupy a certain phylum and virtual tendencies, but they are always situated as part of even larger and heterogeneous assemblages, which also helps us address issues of politics and agency.62 It is through the dispositifs of media technological power that what exists and what is true become defined. Power, then, is the primary enactor of cultural techniques and functions as the filter through which cultural discourses and nondiscursive practices emerge.

Tracking and mapping multiplicities is a key focus of this work, and the task is reflected also at the methodological level. As noted above, this work is a meshwork of a number of elements rather than an application of any clear-cut theoretical idea or methodological guideline. In this task, I follow Foucault’s general method, or ethos, of eventualization. In an interview published in 1980, Foucault outlined the views discussed in his earlier genealogical papers. An analysis and production of events aims to make visible singularities “at places where there is a temptation to invoke a historical constant, an immediate anthropological trait, or an obviousness that imposes uniformly on all.”63 For me, this produces an important impetus with which to break out of the hegemony of representation analysis. Analyzing representations can be defined as focusing on how cultural identities are reproduced (and resisted) in cultural discourses from mass media to everyday life. It is an attempt to focus on the productive power that moulds identities and produces cognitive and affective ways of being. Such an analysis can be seen as successful in its underlining of the dynamics of subject formation as part of representational grids, but recent years have also seen a critique of its insufficient approach to the immanent materiality of culture.64

In a Deleuzian–Spinozian vein, the media philosopher Luciana Parisi calls for new modalities of thought that would go one step beyond representational analyses to encompass a more material level of engagement. Representational cultural analysis easily ends up merely reproducing the same grid on which the entities are positioned. Critique risks solidifying the objects of the critique. According to Parisi, critical theories from semiotics to structuralism and post-structuralism do concentrate on determinants (“a semiotics of signification, bodies’ positions in a structure, ideological overdeterminations”65) but are not able to take into account the material ethics of becoming. Thus, a focus on assemblages (Deleuze) or mixed semiotics (Guattari) articulates how signifying discourses and material a-semiotic encodings always act in pacts. Parisi’s point, following Deleuze and Guattari, and Spinoza, is, then, that reality is not “exclusively constituted by significations, discourses and ideologies,”66 and hence a critical analysis of reality should methodologically focus not merely on reflections or representations of reality but also take reality itself as under construction.67

Following such ideas, Digital Contagions moves in the direction of thinking of the outside, the event as radically new, the unexpected—and not only mapping representational and symbolic orders and engaging in a politics of representation. As eventualization, this project challenges the assumed uniformity that surrounds this special class of software. Not merely malicious software made by juvenile vandals but an inherent part of digital culture, computer worms and viruses become much more interesting than security risks alone. Hence, the task is not only to reproduce the banal fact that viruses are represented as malicious but also to challenge this one-sided approach and to produce singularities that would open up novel fields of understanding. This attributes an active interpretive or, more accurately, connective role to the writer. As stated above, the writer connects and works with materials to summon or conjure forth worlds that have been neglected, frozen. The analyst, or media archaeologist, eventualizes, which in Foucault’s wake means “rediscovering the connections, encounters, supports, blockages, plays of forces, strategies, and so on, that at a given moment establish what subsequently counts as being self-evident, universal, and necessary.”68 Media archaeology should not only track the majoritarian understanding of the discourses and dispositifs of digital culture but also aim to follow the detours and experiments that remain virtual, yet real, in the shadows of the actuality of hegemonic understanding.

The accidental is not merely an object of research but points toward a method that also resonates with Deleuze’s thought. In Difference and Repetition, he categorizes the questions “How much?” “How?” “In what cases?” and “Who?” as demands that restore multiplicity to ideas and hence bypass an ethos focused on stable, unmoving essences.69 This can be understood as a Nietzschean questioning of genealogy, as further developed by Deleuze and Foucault, where thought and representations are revealed as posited in and of violence, power. This type of genealogical perspective on accidental ontology admits that any position is unstable, situated in relation to others, and devoid of eternity. No knowledge (connaissance) is outside the will-to-knowledge (vouloir-savoir), a realization that leads not to relativism but to admitting that knowledge is imbued in power and hence always a system that is more or less unstable.70

History and the archive as a structuring logic of history form an essential part of “what we are and what we are ceasing to be,”71 acting as a form of depositary for possible actions and significations. Whereas the majoritarian central memory, or archaeological stratum, has so far devalued nomadic memories as secondary a-signifying practices, a Deleuzian ethos of memory emphasizes the constant need for deterritorialization of this central memory, but in a thoroughly affirmative manner. As Rosi Braidotti wonderfully writes, this is an “intensive, zigzagging, cyclical and messy type of remembering” that endures complexities, thus dislodging the established points of subjectivity from their places, opening up new sites and territories of acting and remembering.72

Braidotti also brings the force of imagination to the fore. Becoming minoritarian means engaging with the force of imagining “better futures.” A nomadic memory of zigzag territorialization aims at actualizing “virtual possibilities which had been frozen in the image of the past.”73 Memory is conjoined with a creative imagination to bring forth a new, monstrous future of nomadic subjectivities that are resistant to the majoritarian modes of affecting, perceiving, thinking, and remembering. Historical analysis is thus also stretched toward the future. In general, cultural analysis is a matter of how to fold the outside, the event (accidentum), the unpredictable into effective and affective assemblages that make a difference—in other words, how to come up with a different future. As Grosz notes, this problematic is inherently tied to the question of time and to bringing forth a form of duration that is committed not to continuous growth but to “division, bifurcation, dissociation” and difference.74 This is a theoretically rewarding challenge and something that can also act as a guide for thinking about technology and the alternative genealogies that can inform both a fresh understanding of digital culture and new methods for the humanities. This is also the task of this book, as well as of the other two that form the so-called media ecology trilogy alongside it: Insect Media and A Geology of Media. But first, viruses and software.


1.Virilio & Kittler 1999, 84.

2.Stephenson 1993, 371.

3.Schivelbusch 1977. Lundemo 2003.

4.Quoted in Sterling 1994, 12.

5.McNeill 1998, 170–178, passim. Vuorinen 2002. DeLanda 2003, 242.

6.Kittler 1997, 55.

7.Clark 1997, 79.

8.Schivelbusch 1977, 118–119.

9.McLuhan 1962.

10.A focus on accidents opens up a new perspective on media technologies. See, e.g., Virilio 1993, 212. On the cultural history of accidents, see also, e.g., Schivelbusch 1977. Trond Lundemo (2003, 25) has analyzed the ideas of uncontrollability and danger inherent in media technologies, demonstrating how “the decomposition of the machine and the erasure of memory,” in particular, are key themes considering the age of digitality. Ontologically invisible digital technology reveals itself only in the event of breaking down, which gives accidents a special status in a cultural sense. Every media ecology or discourse network seems to have an accident of its own, and this work aims to highlight the position computer worms and viruses have in relation to the network culture of the late twentieth century. They reveal technology, and the power/knowledge relations that media are embedded in. These ideas stem originally from Heidegger’s notions of the ontology of Being. See ← xxxiii | xxxiv → Heidegger 1996, §16. Viruses and worms have been analyzed only partially and/or fleetingly from a cultural perspective, whereas technical and practical antivirus manuals and books have been abundant ever since 1988. For useful sources on the cultural contexts of computer worms and viruses in relation to the Internet, the AIDS phenomenon, the rise of the digital culture, and cyber risks, see Ross 1990; Lupton 1994; Saarikoski 2004, 360–377; Van Loon 2002a, 147–168; Sampson 2004, 2009, 2012; Galloway 2004, 176–184; Mayer & Weingart 2004a; Thacker 2005; Bardini 2006; O’Neil 2006. See also my articles Parikka 2005a, 2005b, 2005c, 2007; and Parikka & Sampson 2009.

11.Fuller 2005, 63.

12.John Johnston explains Friedrich Kittler’s term “discourse network” as marking the archival condition of a particular age or, more accurately, the potentials of inscription “by a culture at a particular moment in time.” Johnston continues: “The notion of the discourse network points to the fact that at any given cross-sectional moment in the life of a culture, only certain data (and no other) are selected, stored, processed, transmitted or calculated, all else being ‘noise.’ (…) In other words, on the basis of this particular selection of data not only perceptions, ideas, and concepts—all that is coded as meaningful in short—but also a system authorizing certain subjects as senders and others as receivers of discourse is instituted.” Johnston 1997, 9–10. Cf. Kittler 1990, 1999. The term “discourse network” is thus not the best translation of Aufschreibesystem, a “system for writing down,” for engraving or for recording. Kittler has attempted to broaden this perspective to encompass more technical information channels as part of the archival layer, or discourse networks. As Kittler puts it, even if all books are discourse networks, not all discourse networks are books: “Archeologies of the present must also take into account data storage, transmission, and calculation in technological media.” Kittler 1990, 369. I use the concept of discourse throughout this work, but it should be read as stemming from a materialist understanding of discourse networks. Foucault underlines the need to see discourses and nondiscourses (material practices) as inherently intertwined. Discourses are immanent to, for example, habits, operations, spaces, and practices that produce the world—cultural techniques in the material sense. Discourses are about producing certain effects and affects, implying that the concept is not to be taken as solely textual. Deleuze insists that Foucault is not to be understood as an archaeologist of linguistic statements alone. Foucault’s focus is on the relations of the visible and the sayable, which form the space of “the audiovisual archive.” A historical dispositif consists of architectural, technological, and philosophical or conceptual elements. Deleuze 1998. See also Rodowick 2001, 49–54.

13.Cf. Terranova 2004, 98–100.

14.Compared to the earlier archaeological analyses, the genealogical ideas bring in an important emphasis on writing counterhistories. See Gutting 1989, 271. Several media archaeological theories are more accurately a combination of the genealogical and the archaeological. See, e.g., Zielinski 2006. I want to emphasize how the archaeological analysis of a priori conditioning of statements, objects, and processes (a certain form of Kittlerian view of discourse networks) should be tied intimately with a commitment to producing new genealogies. Here, as Foucault himself noted in his Berkeley lecture in 1983, genealogy becomes a mode of inquiry into how archaeological events condition the present, how the archaeological a priori continuously translates into our contemporary condition. Foucault 1994. Cf. Kittler 1990. Siegfried Zielinski (1999) positions ← xxxiv | xxxv → his archaeological quest in three intellectual traditions: (1) Raymond Williams’s idea of culture as a way of life and technologies as cultural practices; (2) systems-theoretical approaches where technology is considered as unities of origination/production and use; (3) the metapsychological film theories of Jean-Louis Baudry, Jean-Louis Comolli, and Christian Metz, where the concept of apparatus is developed. On media archaeology, see also Huhtamo 1997, Elsaesser 2004, Huhtamo & Parikka 2011, and Parikka 2012.

15.See Deleuze 1990. In a manner reminiscent of Larry Grossberg’s radical contextualism, I understand context not as the background of a study but as the essence of what a study tries to map, the “very conditions of possibility of something.” See Grossberg 2000, 41. Context understood in this way draws from a Deleuzian–Nietzschean understanding of forces as the processes of differentiation that create the world. See Deleuze 1986.

16.This connects to the conceptualizations of time in cultural historical research promoted by, for instance, Fernand Braudel as well as the remediation thesis of Bolter and Grusin (2000). For instance, Braudel (1997, 205) notes how every actuality is composed of multiple durations and variable rhythms that coexist.

17.Cf. Longley 1994, 589–590.

18.Virilio 1993, 212. On Virilio and his notions of accidents, see Redhead 2004.

19.Scott Berinato: “Attack of the Bots.” Wired, vol. 14, issue 11, November 2006, <>.

20.Brunton 2013.

21.Chun 2015.

22.See “Hackers Remotely Kill a Jeep on the Highway—With Me in It.” July 21, 2015, Wired, <>.

23.“Iranian hackers broke into what they thought was a Chevron gas pump—but it was a honeypot.” August 13, 2015, Fusion, <>.

24.Brunton 2013, 112.

25.Malicious software can generally be defined as software that intentionally, and without consent, damages software or computer systems. It thus differs from software bugs, which cause harm unintentionally.

26.Castells 2001, 172. Cf. Campbell-Kelly & Aspray 1996, 283–300.

27.Castells 2001, 172–173. Hardt & Negri 2001. Castells also refers to the libertarian social movements of the 1960s and early 1970s in Europe and the United States as influential in this context.

28.Hardt & Negri 2001, 298. See also Hardt & Negri 2001, 32. Urry (2003, 9) emphasizes how global power should not be conceptualized as a substance of sorts but as an emergent, complex process. The Empire should not, then, be understood as a hierarchy of power but as part of an emergent networked process that interconnects local issues with global vectors.

29.John S. Mayo, “Materials for Information and Communication.” Scientific American, vol. 255, October 1986, 51.

30.The entanglement of media and catastrophe is, of course, not a novel theme, as exemplified, for example, by Mary Ann Doane (2006) in her analysis of the televised catastrophes that break the temporal and spatial habits of everyday life in a mediatized society.

31.See Gerrold 1975. Brunner 1976. Ryan 1985. Beeler 1973. Latva 2004. ← xxxv | xxxvi →

32.See Parikka 2007.

33.See Sontag 2002.

34.In computing, “hex” refers to “hexadecimal,” a base-16 notation used as a compact way of writing binary code: each hexadecimal digit stands for exactly four binary digits.
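A minimal sketch of the notation (in Python, added here purely as an illustration; the values are arbitrary):

```python
# One byte written three ways: binary, decimal, and hexadecimal.
value = 0b11111111    # eight binary digits
print(value)          # 255 in decimal
print(hex(value))     # '0xff': the same byte as two hex digits
print(int("ff", 16))  # and back from hex to decimal: 255
```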

35.Cf. Deleuze & Guattari 1987, 66, 504. Wise 1997, 63. Following Wise’s Deleuzian–Guattarian reading, technology is here seen as enacting corporeal materiality and being always entwined with the incorporeality of language (as the order-word). Slack & Wise 2002, 495.

36.Cohen 1994. See also Cohen 1991b.

37.See Ludwig 1993. Cf. Vesselin Bontchev: “Are ‘Good’ Computer Viruses Still a Bad Idea?” EICAR Conference Proceedings 1994, 25–47. Julian Dibbell: “Viruses Are Good for You.” Wired, vol. 3, issue 2, February 1995.

38.Wark 1994, 120.

39.Sontag 2002, 177–178.

40.Cf. Latour 1993, 51–55.

41.Terranova 2004, 67. In a more practical context, this notion of “parasitic computing” as part of the Internet infrastructure was articulated in 2001 by a group of scientists. Albert-László Barabási, Vincent W. Freeh, Hawoong Jeong, & Jay B. Brockman: “Parasitic Computing.” Nature, vol. 412, August 30, 2001, 894–897.
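The mechanism of that Nature paper can be glossed briefly: the authors crafted TCP packets so that remote servers, simply by performing their routine checksum verification, unwittingly tested candidate solutions to a computational problem; only packets encoding valid solutions survived to the application layer. What follows is a minimal sketch of the standard Internet ones'-complement checksum they exploited (an illustration in Python under my own naming, not code from the paper):

```python
def internet_checksum(data: bytes) -> int:
    """Ones'-complement sum of 16-bit words (in the style of RFC 1071)."""
    if len(data) % 2:  # pad odd-length input with a zero byte
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF  # invert to obtain the checksum field

# A receiving host recomputes this sum and silently drops mismatching
# packets; parasitic computing turned that verification into computation.
print(hex(internet_checksum(b"hello world")))
```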

42.See Harley, Slade, & Gattiker 2001, 87–88. Since February 2016, the Internet Archive has hosted the Malware Museum, a collection primarily of MS-DOS viruses and other examples from the 1980s and 1990s. They are emulated online, and one can find examples of the different payloads of the viruses.

43.In this study, I use the term “virus” generically, referring to “worms” only when the distinction needs emphasis. This is also the general way these programs are discussed in popular media. See the Wikipedia entries on “computer virus” and “malicious software,” <>.

44.See Cohen 1991b, 1994. See also John F. Shoch & Jon A. Hupp: “The ‘Worm’ Programs—Early Experience with a Distributed Computation.” Communications of the ACM, vol. 25, issue 3, March 1982, 172–180.

45.As Kittler (2001, 14–19) notes, cultural analysis, or Kulturwissenschaft, is in its modern form fundamentally cultural historical analysis.

46.As Wolfgang Ernst (2002) underlines, archives are not traces of a continuous past life; they are by nature fragmentary monuments embedded in media technological networks. The media archaeological task is, then, to write discontinuous links between such archival events. Archaeology does not write of things behind the archive, and it does not attempt to restore a way of life of human beings. Instead, it focuses on the arche, the technological conditions of existence, of culture.

47.On cultural analysis, see Bal 1999, 1, 12–14. See also Deleuze & Guattari 1987, 3–25. Walter Benjamin’s concept of cultural history rests on a similar idea of historical materialism as a mode of thinking that proceeds with the past. Historical “objects” are thus no mere objects of thinking but participants in historical modes of thought. See Caygill 2004, 90–91, 94–95. Benjamin’s stereoscopic historical research aims at combining the images of the past with the contemporary moment to create a critical constellation. It taps into tradition to find cracks that can be carried into the creation of a novel ← xxxvi | xxxvii → future. See Buck-Morss 1991, 289–292. This type of discontinuous view of history resonates with Foucauldian themes of archaeology.

48.Cf. Massumi 1992, 46.

49.Munster and Lovink (2005) argue “against biologism.”

50.See Castells 1996. Cf. Beck 2002. Beck argues for a cosmopolitan agenda and theory of social research.

51.Goddard 2011.

52.Fuller 2005, 2.

53.Quoted in DeLanda 1991, 140. Cf. Deleuze 1997b, 185–186.

54.Following Deleuze (1998, 39–40), machines are not solely technical machines but also social. The social machine, which does not refer only to human societies, selects and assigns the technical elements used. In a Deleuzian–Guattarian ontology, the seemingly solid objects of technology, or of culture, such as identities, sexualities, gender, and institutions, are only a consequence of an underlying flux. The world is about connections between these flows, and in these connections seemingly solid objects form: technologies, humans, animals, cultural artifacts, and natural things. The solid objects are the molar entities of culture, whereas movement (as change) happens on a molecular level (Murphie & Potts 2003, 30–35; Deleuze & Guattari 1987). Flows do not emanate from individuals, but instead individuals are formed at the intercrossings of cultural flows, as slowing-downs of movements. Individuals are always part of synthetic (machinic) assemblages, which consist of a partial conjoining of heterogeneous flows. This Deleuzian–Guattarian view differs from the more structuralist (and Marxist) versions, where individuals are determined by the structures. Individuals are not effects of an underlying power structure, whether economic, linguistic, or psychic, as some psychoanalytic accounts might suggest, but overdetermined sites of power and knowledge, not reducible to one type of power relationship. In other words, as Foucault reminds us, where there is power, there is also counterpower, implying the constant dynamics going on in cultural discourse networks. Deleuze conceptualizes this as the primacy of lines of flight: a society is defined by the lines that escape the molar machinations, not by its stable entities (Deleuze 1997b, 188–189). In Foucault’s terminology, the issue is one of cultural techniques, where knowledge is in itself a technique for organizing, participating in, and delineating flows (Murphie & Potts 2003, 30).

55.Murphie & Potts 2003, 30–35.

56.Colebrook 2002, 119.

57.The materiality of the “object” of research follows from my understanding of how texts function as part of their surroundings. As a major part of my sources is written material, this might easily imply that the tracings I make are merely “symbolic,” semantic signs of meaning (the signified). Nonetheless, the rhizomatic stance toward texts, as a supplement to representational analysis, feeds on Deleuze and Guattari’s notions from the first chapter of A Thousand Plateaus, where they emphasize that texts are not to be (merely) interpreted, nor are they images of the world; more accurately, they work within the world. The linguistic model of the world is too restricted to account for the connecting of language “to the semantic and pragmatic contents of statements, to collective assemblages of enunciation, to a whole micropolitics of the social field” (Deleuze & ← xxxvii | xxxviii → Guattari 1987, 7). In cultural history, the reception of Foucault marks a significant dividing line. Whereas such Anglo-American writers as Peter Burke (2004, 76) see Foucault as part of the linguistic turn and as a thinker of the discursive, Roger Chartier (1997, 68–69), for example, has, rightly in my opinion, criticized this one-sided notion and highlighted the necessary articulation of the discursive with the nondiscursive. This is what Foucault clearly states in The Archaeology of Knowledge. See Foucault 2002. I approach texts as machines of production, as creators of effects, affects, and thoughts that intertwine with the nondiscursive planes of culture; in this they are always multiplicities that cannot be reduced to their majoritarian, hegemonic uses and interpretations. There is always potential for something more, for some new connection.

58.Deleuze & Guattari 1987, 80–81.

59.Shannon & Weaver 1949. See Serres 1982. Brown 2002. Kittler 1993, 168.

60.See Massumi 2002, 18–19.

61.Gere (2002) has noted how post-structuralism has contributed strongly to the same discourse and to our understanding of digital culture. There has, since the 1960s, been a constant feedback between technological, theoretical, and cultural issues, which is why there is a strong resonance between digital culture and, for example, certain strands of “French theory,” such as that of Derrida or of Deleuze and Guattari.

62.See Deleuze & Parnet 1996, 85. Cf. Kittler 1990. Also Matthew Fuller (2005, 60–62) notes that Kittler attenuates the political aspects of Foucault’s discourse thinking.

63.Foucault 2000c, 226. This approach resonates with certain themes of media archaeological research where the focus has been on similar minoritarian tracings and analytics of becoming. Zielinski 2006.

64.See, e.g., DeLanda 1999. Grosz 2005, 44–46. Terranova 2004, 8–9, 35. Wiley 2005. Parikka & Tiainen 2006. Dolphijn & van der Tuin 2012. Deleuze and Spinoza contrast with the Kantian tradition of cultural analysis, which can in broad terms be characterized as critical analysis of the conditions of knowledge. Kant’s questioning of the conditions of knowledge that filter the world for us humans can still be seen as a dominant method of cultural analysis, one that sees the world as mediated by a priori categories (mental and cultural), which is why the world in itself is beyond our reach. (See Hallward 2006, 11–12.) The world is divided into the spheres of the world-in-itself (noumena), the appearances, and the subject, a division that has contributed to the representational emphasis of contemporary cultural analysis. In this mode of thought, the world is mediated by the representations we have of it. The recent critique of correlationism has engaged with this dilemma, which also has longer roots in feminist theory and post-structuralist philosophy. Elizabeth Grosz argues that the problem with representation-oriented cultural studies is that it restricts nature (and the material) to being the passive nonform that culture as an active force fabricates. She argues that one should think of the material as the outside of the representational, as the event that produces the symbolic. Nature, the material, the event are the outside of the representational, the multiplicity that enables the emergence of a cultural order (Grosz 2005, 52, 220–221n4). In the Deleuzian–Spinozian approach to cultural reality, there is no primary division between noumena and appearances. Instead, reality is characterized by univocity. There is one voice for the world; all things are expressions of the same force of the world. The world is characterized by the fundamental immanence of processes and things, of relations traversing the whole. However, the ← xxxviii | xxxix → whole is not a uniform and determined essence but a multiplicity, which leaves open the possibility of change, creation, and politics. In Deleuze’s view, the world does not consist of differences between representations but more fundamentally of a differing at the heart of the nature–culture continuum: a force of differing that endows the material world with the potential to change, irrespective of the representations “attached” to things and relations (cf. Hallward 2006, 11–26).

65.Parisi 2004b, 74.

66.Ibid., 75.

67.Ibid., 84.

68.Foucault 2000c, 226–227.

69.Deleuze 1994, 188. This stance can be attached to the division between majoritarian and minoritarian readings of culture and history. See Goodchild 1996, 54–55. On cultural analysis as experimental perturbations, see Massumi 1992, 68.

70.Foucault 2000e, 387.

71.Deleuze 1992, 164.

72.Braidotti 2001, 187.

73.Ibid., 188.

74.Grosz 1999. It would again be interesting to connect such themes to a Benjaminian ethos of critical historical analysis. See Buck-Morss 1991.