The Universal Viral Machine

Bits, Parasites and the Media Ecology of Network Culture

Jussi Parikka

“Organisms are adapted to their environments, and it has appeared adequate to say of them that their organization represents the ‘environment’ in
which they live […].”[1]

— Humberto Maturana

Prologue: The Biology of Digital Culture

During the past few decades, biological creatures like viruses, worms, bugs and bacteria seem to have migrated from their natural habitats to
ecologies of silicon and electricity. The media has also been eager to employ these figures of life and monstrosity in representing miniprograms,
turning them into digital Godzillas and other mythical monsters. The anxiety these programs produce is largely due to their alleged status as
near-living programs, as exemplified in this quote on the Internet worm of 1988:

The program kept pounding at Berkeley’s electronic doors. Worse, when Lapsley tried to control the break-in attempts, he found that they came faster
than he could kill them. And by this point, Berkeley machines being attacked were slowing down as the demonic intruder devoured more and more computer
processing time. They were being overwhelmed. Computers started to crash or become catatonic. They would just sit there stalled, accepting no input.
And even though the workstations were programmed to start running again automatically after crashing, as soon as they were up and running they were
invaded again. The university was under attack by a computer virus.[2]

Such articulations of life in computers have not been restricted to these specific programs, but they have become a general way of understanding
the nature of the Internet since the 1990s. Its complex composition has been depicted in terms of “grassroots” and “branching structures”, of
“growing” and “evolution.” As Douglas Rushkoff noted in the mid-1990s, “biological imagery is often more appropriate to describe the way cyberculture
changes. In terms of the way the whole system is propagating and evolving, think of cyberspace as a social petri dish, the Net as the agar-medium, and
virtual communities, in all their diversity, as the colonies of microorganisms that grow in petri dishes.”[3]

In this article, I examine computer worms and viruses as part of the genealogy of network media, of the discourse networks of the contemporary
media condition. While popular and professional arguments concerning these miniprograms often see them solely as malicious code, worms and viruses
might equally be approached as revealing the very basics of their environment. Such a media-ecological perspective relies on notions of
self-referentiality and autopoiesis that problematize the often all-too-hasty depictions of viruses as malicious software, products of vandal
juveniles. In other words, worms and viruses are not antithetical to contemporary digital culture, but reveal essential traits of the techno-cultural
logic that characterizes the computerized media culture of recent decades.

I place special emphasis on such functions of the past decades of digital culture as networking, automation, self-reproduction, copying and
communication. These terms have been incorporated both in the vocabulary of media culture, as well as in the practical engineering work performed by
computer scientists and other professionals who implement the principles of computing across the globe. As I have discussed the connection of computer
viruses and information capitalism elsewhere [4], the present text focuses more on the socio-technological
genealogy of the phenomenon, thus supplementing the work already carried out.

In 1994 Deborah Lupton suggested that computer viruses could be understood as metonyms “for computer technology’s parasitical potential to invade
and take control from within”[5], thus expressing the ambivalent reception — vacillating between anxiety
and enthusiasm — with which the computer has been greeted during recent decades. In a similar manner, I ask whether viruses are a metonymy, or an
index, of the underlying infrastructure, material and symbolic, on which contemporary digital culture rests. Whereas some biologists claim,
“[a]nywhere there’s life, we expect viruses,”[6] it seems to me that this can perhaps be extended to the
world of digital culture too. Mapping the (historical) territories computer worms and viruses inhabit produces a cartography of these effective pieces
of code that does not reduce them to the all-too-general class of malicious software but acknowledges the often neglected centrality such types of
programs have in the network ecology of digital culture. Such pieces of viral code show us how digital society is inhabited by all kinds of
quasi-objects and non-human actors, to adopt Bruno Latour’s terminology.[7] In this sense, artificial life
projects and the biological metamorphoses of the digital culture of recent decades provide essential keys to unravelling the logics of software that
produce the ontological basis for much of the economical, societal and cultural transactions of modern global networks.

The contemporary cultural condition is often described as an essential coupling of war and media — and the cybernetic logistics of command,
control, communications and intelligence, C3I — extended from strictly military networks to also include the entertainment media.[8] I suggest, however, that “life” and ideas such as “ecologies” and “territories” can also act as valuable
theoretical points of reference in understanding the paradigms of digital culture. Cybernetics, like the other scientific origins of modern-day digital networks, also focuses on life and on coupling the biological with the technological, a theme that has gained ground especially during the past few decades along with an ever-increasing amount of semi-autonomous software. Instead of simple top-down design and control, we have more and more artificial yet
life-like processes of self-organization, distributed processing and meshworking — themes that, while key cultural symbols, are also real processes
underlying the media ecology of digitality.

Viruses and worms present themselves as culminations of these cultural trends, while also functioning as novel “tools for thought”[9] for a media theory that focuses on complexity and connectionism. Complexity theories have found their
niche within philosophy and cultural theory emphasizing open systems and adaptability. Similarly, theories that underline the co-evolution of the
organism and its environment also provide important points of view for studying digital culture, allowing thought to bypass object-subject dichotomies
and see this media cultural condition as one of continuous feedback and self-recreation. The ingenious realization of various projects of digital
culture has been that their understanding of “life” was based on self-reproduction and a coupling of the outside with the inside, a process of
folding. This essay follows this trace, and folds this theme with cultural theory concerning digital network culture. In short, even though the
aforementioned terms “life”, “ecology”, etc. are easily self-referential loops, or — as in other cases — formal models, I want to suggest a more
subtle idea. When discussing the “life of network culture”, it should not be taken as a form, but rather as movement and coupling in a similar manner
as Deleuze’s reading of Spinoza affirms:

“The important thing is to understand life, each living individuality, not as a form, or a development of form, but as a complex relation between
differential velocities, between deceleration and acceleration of particles.” [10]

This ecological perspective does not, then, rely on formal characteristics of life, but is a tracing of the lineages of the virtual machinic phylum
of digital network culture and also a tracing of the paths of organisms that move on this plane: a biophilosophy[11], or genealogy, of digital life. Hence, although the focus here is on genealogies of network culture, this mapping is done in
order to provide rewirings for futures and becomings, as the final part of this article will illustrate.

The Universal Virus Machine

Fred Cohen has become known as the pioneer who engaged in deciphering the potentialities in viral programs at the beginning of the 1980s. Cohen’s
1983 experiments on viruses have since become famous, and Cohen, then a Ph.D. student in electrical engineering at the University of Southern California, has usually been cited as the one who realized the potential dangers of viral programs.[12]
The “denial of service” attacks Cohen described and warned about have since proved a very feasible means of information warfare, a war taking place on the level of digital coding — softwar(e), as a 1985 spy thriller named it.[13]
Cohen illustrated this in a piece of pseudo-code meant to give an idea of what viral programming might look like in principle:

subroutine infect-executable:=
   {loop: file = get-random-executable-file;
    if first-line-of-file = 1234567 then goto loop;
    prepend virus to file;
   }[14]
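
To make the routine concrete, it can be restated in a modern language. The following Python sketch is my illustration, not Cohen's code: the "filesystem" is a hypothetical in-memory dictionary, so the infection logic runs harmlessly as a simulation.

import random

SIGNATURE = "1234567"  # the first-line marker Cohen's routine checks to avoid re-infection

# A toy "filesystem": executable names mapped to lists of code lines (all hypothetical).
filesystem = {
    "a.exe": ["print('running a')"],
    "b.exe": ["print('running b')"],
}

virus_body = [SIGNATURE, "# the viral routine itself would be inserted here"]

def infect_executable():
    """Pick random executables until an uninfected one is found, then prepend the virus.

    Mirrors Cohen's loop ('if first-line-of-file = 1234567 then goto loop'); like the
    pseudo-code, it would loop forever once every file carries the signature.
    """
    while True:
        name = random.choice(list(filesystem))
        if filesystem[name][0] != SIGNATURE:
            break  # found an uninfected file
    filesystem[name] = virus_body + filesystem[name]  # 'prepend virus to file'

infect_executable()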

Evidently, the stir about viruses and worms that arose at the end of the 1980s was due to the realization that this far-from-inert piece of code
might be responsible for the “digital hydrogen bomb”, as the cult 1980s cyberculture magazine Mondo 2000 noted.[15] As the Cold War period’s anxiety over nuclear weapons seemed to be fading, computer miniprograms and malicious hackers proved
a novel threat.

Fred Cohen was not, however, thinking merely of digital guerrilla war but of life in general, of the dynamics of semi-autonomous programs,
highlighting that the two, war and life, are not contradictory modalities, in the sense that both are about mobilizing, about enacting. In this
respect, his work has also been neglected, and I am not referring to the objections his research received in the 1980s.[16] Instead of merely providing warnings about viruses, Cohen’s work and Ph.D. thesis presented the essential connections that
viruses, Turing machines and artificial life-like processes have. We cannot be done with viruses as long as the ontology of network culture is
viral-like. Viruses, worms or any other similar programs that used the very basic operations of communicatory computers were logically part of the
field of computing. The border between illegal and legal operations on a computer could not, therefore, be technically resolved — a fact that led
to a flood of literature on “how to find and get rid of viruses on your computer.”

For Cohen, a virus program was able to infect “other programs by modifying them to include a, possibly evolved, copy of itself.”[17] This allowed the virus to spread throughout the system or network, leaving every program susceptible
to becoming a virus. The relation of these viral symbol sets to Turing machines was essential, similar to an organism’s relation to its environment.
The universal machine, presented in 1936 by Alan Turing, has since provided the blueprint for each and every computer there is in its formal
definition of programmability. Anything that can be expressed in algorithms can also be processed with a Turing machine. Thus, as Cohen remarks,
“[t]he sequence of tape symbols we call ‘viruses’ is a function of the machine on which they are to be interpreted”[18], logically implying the inherency of viruses in Turing machine-based communication systems. This relationship makes all
organisms parasites in that they gain their existence from the surrounding environment to which they are functionally and organizationally coupled.

Although Cohen was preoccupied with the practical problems of computer security [19], his work also
has more important ontological implications. Security against malicious software (and the danger of someone using them to wage war) was only one
component of computer viruses, expressed in the difference between the pseudo-code of

subroutine infect-executable:=
   {loop: file = get-random-executable-file;
    if first-line-of-file = 01234567 then goto loop;
    compress file;
    prepend compression-virus to file;
   }

and

subroutine trigger-pulled:=
   {return true if some condition holds}

main-program:=
   {infect-executable;
    if trigger-pulled then do-damage;
    goto next;}

Such pieces of pseudo-code have been used ever since to illuminate the general logic of how viruses work. The small difference between these two examples demonstrates that the activities of viruses are not reducible to the potential damage malicious software is capable of inflicting on national and international bodies of order; the very logic of self-reproducing software is a fundamental issue for the ontology of viruses and the digital media culture of networking. Even if Cohen’s obvious point was to find models and procedures for secure computing — to maintain the flow of information in a society — this task was accompanied by something of a more fundamental nature. Thus, basically, viral routines were not confined to damage but enabled the idea of benevolent viruses as well: a “compression virus”, for example, could function as an autonomous maintenance unit saving disk space.[20] In a similar sense another experimenter from the early 1990s, Mark Ludwig, believed that viruses were not to be judged solely in terms of their occasional malicious payloads but by the characteristics that made it reasonable to discuss them as artificial life: reproduction, emergence, metabolism, resilience and evolution.[21]
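
Cohen's benevolent compression virus can be sketched in the same way. Again this is my illustration rather than Cohen's code, using Python's zlib on a hypothetical in-memory filesystem; a real compression virus would also prepend a stub that decompresses and runs the file when it is executed.

import random
import zlib

SIGNATURE = b"01234567"  # the marker of Cohen's compression virus

# A toy filesystem: names mapped to byte strings standing in for executables (hypothetical).
filesystem = {
    "a.exe": b"a long and repetitive executable body " * 50,
    "b.exe": b"another executable body " * 50,
}

def compress_infect():
    """Benevolent infection: store an uninfected file compressed, saving disk space."""
    while True:
        name = random.choice(list(filesystem))
        if not filesystem[name].startswith(SIGNATURE):
            break  # found an uninfected file
    original = filesystem[name]
    filesystem[name] = SIGNATURE + b"\n" + zlib.compress(original)
    print(name, ":", len(original), "->", len(filesystem[name]), "bytes")

compress_infect()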

This turns the focus to the virulence of virus programs. Being bits of code that, by definition, function only to infect, self-reproduce and
activate from time to time, it is no wonder that a number of computer scientists have been unable to view them as passive material, seeing them instead as something acting, spreading. Others have taken them to be examples of primitive artificial life in their capability to reproduce and spread autonomously (worms)
or semi-autonomously (viruses).

I do not want to address the question of whether worms and viruses are life as we know it, but underline that in addition to being an articulation
on the level of cultural imaginary, this virality is also a very fundamental description of the machinic processes of these programs, and of digital
culture in general. As a continuation to the theme of technological modernization, network culture is increasingly inhabited by semi-autonomous
software programs and processes, which often raise the uncanny feeling of artificial life, as expressed, for instance, in the various journalistic and fictional accounts describing software program attacks. This uncanny feeling is an expression of the hybrid status of such programs, which transgress
the constitutional (in Latour’s sense of the word) boundaries of Nature, Technology and Culture. Whereas viruses and worms have come to be the central
indexes of this transgression for popular consciousness, artificial life projects have also faced the same issue. As transversal disciplines such as
ALife have for decades underlined, life is not to be judged as a quality of a particular substance (the hegemony of a carbon-based understanding of
life) but as a model of the interconnectedness, emergence and behaviour of the constituent components of a(ny) living system. Chris Langton suggested
in the late 1980s that artificial life focuses not on life as it is, or has been, but on life as it could be. This is taken up as the key idea for
projects that see life emerging on various synthetic platforms, silicon and computer-based systems and networks for example. [22] In a similar vein Richard Dawkins, when he viralized cultural reality with his theory of memes in
1976, referred to the possibilities of finding life even in “electronic reverberating circuits.” [23]

Consequently, a more interesting question than that of whether some isolated software programs are alive is to be found in the issue of what kind
of novel approaches the field of artificial life can provide for understanding digital culture. Artificial life might at least provide us with an
approach for thinking of living systems not as entities in themselves, but as systems and couplings — here Thomas S. Ray’s Tierra virtual ecology from the 1990s provides a good example.[24] This ALife approach might also lead us to think of the
contemporary media condition as an ecology of a kind, of “living” in the sense that it is based on connectionism, self-reproduction, and couplings of
heterogeneous elements. This also resonates with the above-mentioned Spinozian understanding of life as affectivity: relations of varying velocities,
decelerations and accelerations between interconnected particles.

What Cohen established, and this might be his lasting contribution even if one does not want to downplay his achievements in computer science, was
the realization that digital culture was on the verge of a paradigm shift from the culture of Universal Computing Machines to Universal Viral
Machines. This culture would no longer be limited to the noisy capabilities of people designing the algorithms. Instead, these evolutionary concepts
of computing provided a model for a digital culture that increasingly relied on capabilities of self-reproductive, semi-autonomous actors. To quote
Cohen’s all-too-neglected words on “viral evolution as a means of computation” that crystallize the media ecology of networking digital culture:

Since we have shown that an arbitrary machine can be embedded with a virus (Theorem 6), we will now choose a particular class of machines to embed to
get a class of viruses with the property that the successive members of the viral set generated from any particular member of the set, contain
subsequences which are (in Turing’s notation) the successive iterations of the “Universal Computing Machine.” The successive members are called
“evolutions” of the previous members, and thus any number that can be “computed” by a TM [Turing Machine], can be “evolved” by a virus. We therefore
conclude that “viruses” are at least as powerful a class of computing machines as TMs, and that there is a “Universal Viral Machine” which can evolve
any “computable” number.[25]
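
The self-reproduction on which Cohen's theorem turns can be demonstrated with the classic quine, a program whose output is its own source code. The Python example below is an illustration of the principle only, not part of Cohen's proof:

# Running the two lines below prints exactly those two lines: code reproducing itself.
s = 's = %r\nprint(s %% s)'
print(s % s)

Add a conditional trigger and an environment that interprets the copies, and one has the ingredients of Cohen's viral set, whose successive "evolutions" can carry out any Turing-computable task.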

Code Environment

From an everyday perspective the question of technological evolution might seem oxymoronic, considering the violent intermingling of two such
different spheres as “biology” and “technology.” This issue has been thoroughly discussed ever since early cybernetics in the 1950s, and the
articulations of biology and technology continue to prove their operationality when understood as a questioning of the dynamics of technology. As
Belinda Barnet notes in her essay on the question of technological evolution and life, what is at hand is the need to grant “the technical object its
own materiality, its own limits and resistances, which allows us to think technical objects in their historical differentiations.”[26]

Barnet’s agenda connects to my articulation of a media ecology. Computer worms and viruses, as well as other technical elements of digital culture
for that matter, are not reducible to the discourses or representations attached to them, and in order to understand the complex nature with which
they are intertwined in the material cultural history of digitality, one must develop alternative concepts and approaches. In this problematic, “life”
and “dynamics” seem to resonate together in a manner proposed by complexity theories that value the processual nature of (open) systems based on the
ongoing feedback loop between an organism and its environment. However, since these notions easily remain vague metaphors, they need to be addressed
more thoroughly in order to amplify their implications for contemporary media ecology. Here I will approach the issue via a reference to the way
Deleuze and Guattari have outlined the issues of the machine (as separated from technologies themselves) and machinic ontology as interconnective and
interactive. That is, media ecologies can be understood as machinic processes based on certain technological and social lineages that have achieved
consistency. Machinic thus refers also to a production of consistencies between heterogeneous elements.[27] In such an ontology of flow, technological assemblages are partial slowdowns of flows into more discrete functional
entities. There are no humans using technologies, nor are there any technologies determining humans, but a constant relational process of interaction,
of self-organization, and hence the focus is moved to “subjectless subjectivities”.[28] In this sense,
the life of media ecology is definable as machinic.

Life as connectionism, not as an attribute of a particular substance, has been at the centre of viral theory as well:

The essence of a life form is not simply the environment that supports life, nor simply a form which, given the proper environment, will live. The
essence of a living system is in the coupling of form with environment. The environment is the context, and the form is the content. If we consider
them together, we consider the nature of life.[29]

I would like to especially emphasize the coupling of an entity with its environment as the essence of what constitutes “life.” This has a very
important implication. As scientists who have tackled the idea of computer viruses as artificial life have already noted, it is difficult, or perhaps
even impossible, to fully adopt computer viruses under the criteria of (biological) life. If we take an entity and a list of the qualities it should
display (reproduction, emergence, metabolism, toleration of perturbations and evolution), then nothing other than traditional life will succeed in
meeting the criteria for life.[30] I want, however, to take the suggestions for viewing life and
artificial life in terms of machinic connectionism as horizons and experimental ideas with which to think the contemporary media ecology.

Hence, viruses — and non-organic life in general — should be viewed as processes, not stable entities. Viruses, by definition, are machines of
coupling, of parasitism, of adaptation. Admittedly they might not be “life” as it is defined by everyday usage or a general biological understanding,
but yet they are spectres of the media ecology that invite us to take them as, at least, “as-if-life.” Considering a virus as an infection machine, “a
program that can ‘infect’ other programs by modifying them to include a, possibly evolved, copy of itself”[31], signifies the impossibility of focusing on viruses per se, and demands that we take a wider cultural perspective on these
processes of infection. As part of the logical circuits of Turing machines, viral infection is part of the computer architecture, which is part of the
technical sphere and genealogy of similar technical media machines, which in turn connect to lineages of biological, economical, political, and social
nature, and so forth. Viruses do not merely produce copies of themselves but also engage in a process of autopoiesis: they are building themselves
over and over again, as they reach out to self-reproduce the very basics that make them possible, that is, they are unfolding the characteristics of
network culture. In this, they are machinic subjects of a kind.[32] This viral activity can be
understood also as the recreation of the whole media ecology, reproduction of the organizational characteristics of communication, interaction,
networking and copying, or self-reproduction.[33] This is where I tend to follow Maturana and Varela and
their idea that living systems are part and parcel of their surroundings and work towards sustaining the characteristics and patterns of that
ecology. They occupy a certain niche within the larger ecology: “To grow as a member of a society consists in becoming structurally coupled to it; to
be structurally coupled to a society consists in having the structures that lead to the behavioral confirmation of the society,” [34] writes Maturana.

“Infections” or couplings were part of the genealogy of digital culture even before the 1980s in the form of John von Neumann’s automata, which are
often marked as the ancestors of modern-day worms and viruses. Von Neumann engaged deeply in automata theory, automata referring here to “any system
that processes information as part of a self-regulating mechanism.”[35] Automata capable of reproduction
included logical control mechanisms (modelled on the McCulloch-Pitts theory of neurons) together with the necessary channels for communication between
the original automaton and the one under construction as well as the “muscles” for enabling the creation. This kinetic model of automata was soon
discarded, however, as it proved to be hard to realize: a physical automaton was dependent on its environment for its supply of resources and
providing it with such an ecology proved too cumbersome. Thus, with advice from his friend Stanislaw Ulam, von Neumann turned to developing
cellular automata, formal models of reproductive systems with “crystalline regularities”.[36] One of the
models for formal self-reproductive patterns was the bacteriophage, a very primitive living organism.[37]

Nature, in the form of characteristics of simple organisms, became interfaced as part of these formal models for computation. Cellular automata as
two-dimensional cell tables, with each cell being a finite automaton of its own, its state determined by the states of its neighbouring cells, were to
be understood as neuron-like structures. Once put into action, the automata seemed to take on a life of their own, as demonstrated around 1970 by the Cambridge mathematician John Conway with his version, symptomatically called “Life.” These were essentially coupling machines, bounded however by their
formal characteristics as part of a two-dimensional habitat. While a single cell could not be thought to be alive in any sense of the word, the whole
system, which was in constant interaction, seemed to contain remarkable powers of calculation and emergence.
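
Conway's rule is simple enough to state in a few lines. The sketch below is my own Python rendering, not a historical implementation: a cell lives in the next generation if it has exactly three live neighbours, or two if it is already alive.

from collections import Counter

def life_step(live):
    """One generation of Conway's Life; 'live' is a set of (x, y) cells."""
    # Count how many live neighbours every relevant cell has.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly three live neighbours; survival on two or three.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "glider": a five-cell pattern that travels diagonally, generation after generation.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = life_step(cells)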

Such ideas, which became part of complexity theories, underscored the necessity of understanding the processual nature of (computational) life:
formal mathematical models, computers and perhaps the ontology of the world as well were based on forms of interaction between quasi-autonomous units.
This relates to the need to emphasize that even if modern digital culture, in an archaeology anchored to the importance of World War II and the military origins of cybernetics, computers and networking, is inherently employed as a technology of death, there is also another, thus far neglected, thematics that assigns computers a role in the diagrams of life.[38] In addition to military contexts,
underlying, for instance, von Neumann’s and Wiener’s work, there exists also the striving for the “design of relatively simple simulacra of organic
systems in the form of mathematical models or electronic circuitry.”[39] Such aspects should lead us to
bring forth new genealogies of computing for the contemporary media condition. These perspectives should furthermore complexify our notions of the
history of viruses and virallike programs, as well as lead us to rethink some basic assumptions concerning the contemporary culture of technology,
which is increasingly modelled and designed as a complex, interconnecting ecology.

But, considering the “nature of digital culture”, are these lineages to be seen as metaphors that guided the research done at computer
laboratories, or could the interconnection of life (or at least the science of life, biology) and technology be more fundamental? Instead of
restricting the design work to the level of the metaphoric and language, one could also speak of the diagrammatics of computer design, piloting the
research and implementation done. The research on biology and computers was coupled, both infected by each other during the latter half of the
20th century, so that the human being and nature in general were increasingly understood as informatics (especially so with the boom in DNA research), and informatics was infiltrated by models adopted from brain research and, later, from ecological research. Thus, as von Neumann himself thought, designing computers was a matter of designing organs and organisms[40], that is,
machines that could function semi-independently as natural beings. Nature became the ultimate imaginary reference point for digital culture, not so
much a mirroring but an active interfacing of the technological and the biological.

What I want to emphasize is that this interfacing is not solely linguistic; we should not talk merely about the metaphorics of computer culture (as
a cultural studies perspective so often does), but see the biology of computers also as organizational in that a certain understanding of biological
organisms and ecological patterns and characteristics of life is entwined as part of the design and implementation of digital culture. [41] In this sense, the cultural theory of digital culture could also turn to biology as an aid, and
interface with, for example, Maturana and Varela’s notions of autopoietic living machines where the component is structured as a functional part of
the ambiance. As Guattari notes in Chaosmosis, this idea could be applied to an analysis of social machines as well — and hence to analyzing
the social machine of network culture, or the media ecology of networking. [42] The parts feed the
structuring, while themselves being fed from the whole. Yet, the difference between mere mechanical repetition and creative living systems that
Guattari notes [43] is an important one — which I will return to later with a discussion of the
virtuality of the living system.

Distributed Life Processes

To repeat, computer viruses are machines in the Deleuzo-Guattarian sense of the word in that they are connection-makers, reaching out and beyond
their seeming borders in order to find functional couplings. In a restricted perspective, this means that they couple themselves to files they
infect; by widening our horizon we see, however, that these couplings are inherently connections at the level of the Turing machine, that is, the
architecture of the computer in general.

The ideas of coupling and biological thinking in computing gained consistency especially during the 1970s, when several network projects started to
bloom. ARPANET (1969) was the pioneer, of course, but others followed. Networking meant new paradigms for programming, as well as providing a fertile platform for novel ideas of digital ontology. Viruses and worms were a functional element within this new trend of computing. Consequently, the first archived real virus incident seems to be the Creeper virus, which spread in the ARPANET in 1971. The Creeper was a utility program
made to test the possibilities of network computing. Inspired by the first program, written by the network pioneer Bob Thomas, several programmers
made similar virus-like programs.[44]

The worm tests made at the Xerox Palo Alto Research Center in the early 1980s were informed by similar aspirations. As described by participating
researchers John Shoch and Jon Hupp, worm programs basically meant copying parts of the program to idle machines on the network. The problem, as
demonstrated by the Creeper, was of course that of how to control the spreading. Even the Palo Alto group experienced similar control problems when a
worm that was left running overnight “got away”: “The worm would quickly load its program into this new segment; the program would start to run and
promptly crash, leaving the worm incomplete — and still hungry looking for new segments.”[45]
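
The control logic at stake, a worm holding itself at a target size and recruiting idle machines when its segments die, can be sketched roughly as follows. This is a hypothetical Python simulation of the behaviour Shoch and Hupp describe, not their Alto code:

import random

TARGET_SEGMENTS = 5  # the number of segments the worm tries to keep alive

def worm_tick(machines, segments):
    """One control cycle of a Shoch/Hupp-style worm over a set of host machines."""
    # Some segments die at random (machine rebooted, program crashed, and so on).
    segments = {m for m in segments if random.random() > 0.2}
    # The surviving worm recruits idle machines until it is whole again.
    idle = list(machines - segments)
    random.shuffle(idle)
    while len(segments) < TARGET_SEGMENTS and idle:
        segments.add(idle.pop())
    return segments

hosts = {f"alto-{i}" for i in range(20)}  # hypothetical machine names
worm = set()
for _ in range(10):
    worm = worm_tick(hosts, worm)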

However, the Palo Alto scientists designed these programs — “laboratory worms” of a sort — with useful goals in mind. The existential worm was a
basic test program with no other aim than to survive and proliferate. The billboard worm was designed to distribute messages across a network. Other
applications included the alarm clock worm, the multimachine animation utility using worm-like behaviour and the diagnostic worm.[46] What is important is that basic ARPANET network programs contained worm-like routines, making the
distinction between “normal” programs and parasitic routines ambiguous.

Similarly, the idea of packet switching, pioneered with the ARPANET during the 1970s, introduced local intelligence to communications: instead of being controlled from above, from a centralized, hierarchical position, network communications distributed control into small packets which
found their own way from sender to recipient. In a way, such packets included the idea of autonomy and local intelligence of bottom-up systems, while
the network in general was formed into a distributed multiplexing system.[47] Since then, the basic
architecture of the Internet has been based on data that is intelligent in the sense that it contains its own instructions for moving, using networks
to accomplish its operations. In this sense, we can justifiably claim that the origins of worm-like — and partly virus-like — programs lie in the
schematics of network computing in general. The ongoing ambivalence between anomalous and normal functionalities is part of the virus problem even
today as the same program can be defined as a utility program in one context and as a malware program in another, a fact that has not changed during
the history of modern computer software.[48] Similarly, many basic utility programs have for years been
virallike even though often such programs have to have the consent of the user to operate.[49]
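
A minimal way to picture this local intelligence is a packet that carries its own destination address while each node knows only its next hop, so that no central controller is needed. The following toy sketch (node names and routing tables are hypothetical) illustrates the principle:

from collections import namedtuple

# Each packet carries its own addressing information -- the "local intelligence"
# that lets the network deliver it without centralized control.
Packet = namedtuple("Packet", ["src", "dst", "payload"])

# A toy network: every node knows only the next hop toward each destination.
routing_tables = {
    "A": {"C": "B"},  # from A, packets for C are handed to B
    "B": {"C": "C"},
    "C": {},
}

def forward(packet, node):
    """Move a packet hop by hop until it reaches its destination node."""
    while node != packet.dst:
        node = routing_tables[node][packet.dst]
    return node  # delivered

forward(Packet("A", "C", b"hello"), "A")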

Of course, it can be argued that such programs were merely minor experiments and their significance should not be overestimated. However, they
demonstrate several traits of a new paradigm of computing, or science in general. In computer science ideas of distributed programming and, later, of
neural network programming, for example, were gaining ground, becoming part and parcel of the new (non-linear) order of digital culture. This was due
to the growing complexities of the new networks of computation and communication. As computers had — since the 1970s — no longer been seen as
calculation machines, but as “components in complex systems” where systems are built not from the top down but from “subsystems” and “packages”, the basic
idea of a programmer designing algorithms for carrying out a task and achieving a goal had grown old-fashioned. Designing distributed program
environments was seen as one solution.[50]

A genealogical account might argue that this was a follow-up to the problems the military had already encountered. The entire field of cybernetics
and man-machine symbiosis might be seen as part of the complexification of the military command and control structures for which computers provided
the long hoped-for prosthesis to supplement the normal training of generals, admirals and field personnel.[51] In this sense, these network ecologies are not merely complex systems of a self-organizing nature but also designed systems,
which aim to control the complexity and feedback loops of the system. Computer viruses and worms as well as computer culture in general are at least
partially intentionally constructed, yet they cannot be reduced to being a mere human construction. Instead, network ecologies are mixtures of
top-down design and bottom-up self-organization; we have both stable linear structurings and states of complexity that evolve in a dynamic fashion.

So, in addition to military purposes, (artificial) life (or more precisely the science of life, biology) is another historical context to be
accounted for. In addition to distributed programming, techniques of neural network programming were introduced during the latter half of the 1980s.
While these issues had already been discussed years earlier, the real boom came with the newly stated interest in computer programs with a capacity
for learning:

If several different factors have collaborated to this explosion of interest, surely the discovery of algorithms allowing a neural network with hidden
layers to “learn” how to accomplish a given task has had a profound influence in recent developments on neural networks. This influence is so big that
to many newcomers in the field the expression “neural networks” is associated to some sort of “learning” […].[52]

Such thematics of computer science correspond well with the general change of emphasis from a top-down understanding of intelligence to bottom-up
distributed systems of learning and adaptation, best illustrated in the Shoch-Hupp worm experiments, and perhaps even by the Creeper virus. The
interest in such evolutionary patterns of viral learning continued all the way to the beginning of the 1990s when a new emphasis took over. The early
1990s also witnessed the first polymorphic viruses that seemed to be able to evolve in response to anti-virus actions.[53] Yet, as for example Fred Cohen accentuated, such live programs were alive only as part of their environment, in other words,
as he had argued ten years earlier, a living system was composed of living components that could reproduce, while not every component had to be alive
and produce offspring.[54] Viruses as adaptive, self-reproductive and evolutionary programs were thus
at least part of something live, even if not artificial life in the strongest sense of the word.[55]
They were the new “Darwin machines”[56] that formed the ontology of a new digital culture, also
incorporating the essential capitalist digital utopia of intelligent agents, semi-autonomous programs that ease the pressures put on the (in)dividual
by the increasing information input.[57] Intelligent agents that take care of the ordinary tasks on your
computer or run such errands as reserving tickets, arranging meetings, finding suitable information from the Net and so on are, according to J.
Macgregor Wise, telling of the changes in understanding agency in the age of digital culture,[58] and we
might further emphasize that such programs are actually the culmination of key potentials within the ontology of digital culture. They represent a new
class of actors and functions that roam across technological networks.

One way to grasp this change would be to talk of a Kuhnian paradigm shift in which “life” is no longer restricted to certain carbon-based
organisms. As Manuel DeLanda stated in the early 1990s about artificial life applications in computer science:

The last thirty years have witnessed a similar paradigm shift in scientific research. In particular, a centuries-old devotion to “conservative
systems” (physical systems that, for all purposes, are isolated from their surroundings) is giving way to the realization that most systems in nature
are subject to flows of matter and energy that continuously move through them. This apparently simple paradigm shift is, in turn, allowing us to
discern phenomena that, a few decades ago, were, if they were noticed at all, dismissed as anomalies.[59]

This, too, resonates with the shift of emphasis from top-down artificial intelligence paradigms in computing to seeing connectionism as the
fruitful path to be followed in programming, referred to above. Complexity and connectionism became the key words of digital culture during the 1980s
and ever since. The non-linear processes of thought and computing expressed the “new ideas of nature as a computer and of the computer as part of
nature,”[60] non-reducible to analytic parts but instead functioning as an emergent whole. Concretely
this meant diagrams of digital ecology that depended increasingly on viral computing and semi-autonomous programs. As Tony Sampson describes this new
vision of the digital culture of Universal Viral Machines:

The viral ecosystem is an alternative to Turing-von Neumann capability. Key to this system is a benevolent virus, which epitomises the ethic of open
culture. Drawing upon a biological analogy, benevolent viral computing reproduces in order to accomplish its goals; the computing environment evolving
rather than being ‘designed every step of the way’ […] The viral ecosystem demonstrates how the spread of viruses can purposely evolve through the
computational space using the shared processing power of all host machines. Information enters the host machine via infection and a translator program
alerts the user. The benevolent virus passes through the host machine with any additional modifications made by the infected user.[61]

Thus, no more “Turings” and “von Neumanns” or any other male designers as demiurges of computer hardware and software, except as forefathers of a
posthumanistic digital culture of viral organisms. Interestingly, such depictions at the beginning of the 1990s of a viral ecology of digital culture
are in accordance with a number of other narratives of posthumanism and the automated media culture of artificial life.[62] The Universal Viral Machine also seems to fulfill Friedrich Kittler’s views of machinic subjectivity in the age of Turing
Machines: for Kittler, machine subjects were born with the realization of conditional jump instructions, known also as the IF/THEN-pairing of program
code.[63] This implies that a program can autonomously change its mode of operation during its course of
action. In Kittler’s schema, when computers have detached their read/write-capabilities from human assistance, the entrance of a new kind of
subjectivity on the level of society is entailed. In this view, Fred Cohen’s ironical notion that the first widely reported virus incident, the
so-called Morris Worm (1988), was in fact “the world’s record for high-speed computation”[64] proves an
apt description of the potentialities of the semi-autonomous computational processes of digital culture, which exclude the human operator from the
circuit. Worms and viruses might, then, also be grasped as posthumanist actors of a kind.

Media Ecology: Life and Territory

Digital culture was occupied with a new breed of vital computer programs in the 1980s and 1990s, even though such programs were merely
actualizations of tendencies and aspirations of computer culture since the Second World War. Seeing these programs and digital network culture as part
of the novel field of artificial life was one key attempt to conceptualize and contextualize them. In addition to being interesting examples of the
capabilities of programming languages and digital network architecture, computer viruses and worms can be seen as indexes or symptoms of a larger
cultural trend that has to do with understanding the life of media and the networked digital media culture through the concept of media ecology.
Specifically, the coupling of nature and biology as part of digital architecture has been a central trend since the pioneering work of von Neumann, Wiener and others. This coupling gives an important clue to the genealogical traits of the modern media condition, emphasizing adaptability, automation, complexity, and
bottom-up intelligence, or artificial life. Viruses and worms function as immanent expressions of network culture.

On the other hand, such a conceptual perspective of media as an ecology, as life, or technological dynamism, provides a way of understanding the
complexity, the connectionism and the flexibilities that function at the core of the contemporary media condition. In a way, this also accentuates the
need to ground theories of digital culture in cybernetics (Wiener, von Neumann), and, even more urgently, in second-order cybernetics (Maturana,
Varela, Luhmann, as well as Bateson) which might give an even more subtle and complex understanding of the connectionist technologies of contemporary
culture. Such projects and orientations took their main priority to be in the couplings of systems and environments and the self-organization of
complexity. Hence, approaching the issue of ecology with Gregory Bateson means apprehending ecology as the “study of the interaction and survival of
ideas and programs (i.e. differences, complexes of differences, etc.) in circuits”[65], implying that
prime importance should be given to the coupling of organisms and their environment as the basic unit of evolution.[66]

Ecologies should be understood as self-referential systems, or processes, where in order to understand (or observe) the functioning of the system,
one cannot detach single elements from its synthetic consistency (and label some elements as purely anomalous, for example). Instead, one should focus
on Humberto Maturana’s question: “How does it happen that the organism has the structure that permits it to operate adequately in the medium in which
it exists?”[67] In other words, attention should be on a systems approach that allows one to also think
of digital culture as a series of couplings where “organisms” or “components” participate in the autopoiesis of the general system, which, in our
case, is the digital culture of networking. The autopoietic system is a reproductive system, aiming to maintain its unity in organizational form:

This circular organization constitutes a homeostatic system whose function is to produce and maintain this very same circular organization by
determining that the components that specify it be those whose synthesis or maintenance it secures. Furthermore, this circular organization defines a
living system as a unit of interactions and is essential for its maintenance as a unit; that which is not in it is external to it or does not exist.[68]

From this perspective, computer worms and viruses are not so much anomalous, random or occasional breakdowns in a (closed) system that would
otherwise function without friction, as they are, rather contrarily, part of the ecology they are coupled with. Yes, such programs are often sources
of noise and distortion that can turn against the network principles, but more fundamentally they repeat the essentials of network ecology, in effect
reproducing it. This of course refers to the fact that viruses and worms do not have to contain malicious payloads in order to be viruses and worms.
Hence, one should also analyze such entities on the abstract (machinic) level of their ecological coupling to the machinic phylum of networking.

In this sense, the network ecology should be seen as consisting of both actual and virtual parts in order to allow it a certain dynamism and to
short-circuit the often too conservative focus on homeostasis found in some strands of systems theories. Where Maturana and Varela, for example, tend
to emphasize that the circular system of homeostasis is self-enveloping, I would turn to a more Guattarian view where there is always an ongoing
testing and experimenting of the limits of the organization to see what the potential virtual tendencies of an ecology are.[69] In this sense, media ecologies are not mere systems of empty repetition, but affecting and living entities looking for and
testing their borders and thresholds.

Viruses and worms are tendencies within this machinic ecology of digital culture of the last decades. They are part of the machinic phylum of
network culture, which can be understood as the level of potential interactions and connections. It is a plane of virtuality where specific
actualizations, or individuations, are able to occur. Thus there is always the perspective of (non-linear) evolution in such a comprehension of virtuality. The virtual as a plane of potentiality is something not actually existing (although real), for it is in a constant process of becoming. Just
as nature cannot be grasped as something “given”, media ecologies should be seen as planes of giving, as iterative reserves. Brian Massumi writes
about nature as virtuality and as a becoming, which “injects potential into habitual contexts”, where “nature is not really the ‘given’”, but in fact “the giving — of potential.”[70] As Massumi continues, this is Spinoza’s “naturing nature”, where
nature cannot be reduced to an actual substance, a mere extensive and exhaustible state of being. This stance of active creation can also underscore
the fact that media ecologies cannot be seen as static, hylomorphic structures of autonomous technologies but as active processes of creation, or as a
useful orientation, horizon, with which to think the media condition of digital culture. The future of a media-ecological system is open-ended, making
quite radical changes possible. Hence computer viruses as entropy-resisting instances of life can be seen as part of the autopoietic processes of a
system, yet also as potential vectors of becoming, open-ended becomings for novel conceptualizations of network culture.[71]

In short, on the plane of media ecology as a self-referential system it becomes irrelevant to label some elements as “anomalous”, as not part of
the system, for every element is given by the virtual system (which in itself and in its virtuality cannot be taken as a given, as a preformed
platonic idea). Instead, “anomalies”, if defined alternatively, are particular trackings of certain lineages, of potentials on that plane, not
necessarily disruptions of a system. In addition, in accord with Shannon and Weaver’s communication theory, which recognizes that noise is internal to any communication system, it can be said that every media-ecological system has its white noise, essential to its functioning. At times, of
course, the noise may become too great and enact a change to another constellation.[72] Yet,
fundamentally, nature works via parasitism and contagion. Nature is in fact unnatural in its constant machinic adoption.

From the point of view of a plane of immanence, Nature is not constituted around a lack or a transcendental principle of naturalness; instead it
constantly operates as a self-creating process: “That is the only way Nature operates — against itself.”[73] This is also in accordance with the above-mentioned Spinozian understanding of life, which sees it as an affect: as movements,
rests and intensities on a plane of nature (whether media-ecological or other). Nature is thus not merely a particular substance or a form, but a
potential becoming, which connects to Guattari’s project of virtual ecology, ecosophy: “Beyond the relations of actualized forces, virtual ecology
will not simply attempt to preserve the endangered species of cultural life but equally to engender conditions for the creation and development of
unprecedented formations of subjectivity that have never been seen and never felt.”[74] This experimental ethos amounts to a project of ecosophy that cultivates “new systems of valorization, a new taste for life.”[75]

A media ecology is not, then, based solely on technical or social elements, for instance, but on the relationships of heterogeneous fields in which
the conjoining rhythm of such an ecology unfolds.[76] As technical quasi-objects (or vectors of becoming) are relational to their technical environment (in the way that a virus is part of the Turing environment), such technicalities interface with the so-called human elements of a system, leading us to realize the multifarious constitution of ecologies made up of social, political, economical, technical, and incorporeal parts, to name a few.[77] In addition, as some critics have
underlined, computer worms and viruses are not comparable to biological phenomena because they are merely part of digital code, programmed by humans.
Instead of embracing such a social constructivist perspective we must, rather, see how this shows that people (Kittler’s “so-called human beings”) are
also part of media ecology: humans are part of the machinic composition, which connects and organizes humans and non-humans into functional systems.
In this sense it would be an interesting agenda to analyze how virus-writing practices are related to general vectors of “viral autopoiesis”, of the
symbiotic network ecology. Or, to take another example: how the media technological logic of worms and viruses fits in with the logic of network
organization, collaborative programming and “swarms” as analyzed by Hardt and Negri.[78]

The turbulent network spaces, as Tiziana Terranova refers to them, which support viral software but also ideas and affects, are hence to be engaged head-on and affirmatively. As Terranova notes, “the Internet is not so much a unified electronic grid as a chaotic informational milieu.” This agrees
well with my point concerning the notion of virtuality in media ecologies: media ecologies are not homeostatic grids or rigid structures, but only
partially stable systems (multiplicities) with the potentiality for open-ended becomings. Discussing biological computing, concerned with the emergent
bottom-up “power of the small”[79], Terranova notes that such systems do not follow any simple
autopoietic movement of mechanical repetition; rather, “they are always becoming something else.”[80] This “something else”, this becoming at the heart of the machinic phylum, is what should be incorporated as part of our understanding of media ecologies as well: we are not dealing with rigid structures or platonic heavenly ideas, but with potential tendencies to be cultivated and experimented upon in order to
create alternative futures for digital network culture.

Notes

—————

[1] Humberto R. Maturana and Francisco J. Varela. Autopoiesis and Cognition: The Realization of the Living. Dordrecht and London: D. Reidel, 1980, p. 6.

[2] Katie Hafner & John Markoff. Cyberpunk: Outlaws and Hackers on the Computer Frontier. London: Fourth Estate, 1991, p. 254.

[3] Douglas Rushkoff. Media Virus! New York: Ballantine Books, 1996, p. 247.

[4] On capitalism and computer viruses, see Jussi Parikka. “Digital Monsters, Binary Aliens – Computer
Viruses, Capitalism and the Flow of Information.” Fibreculture, issue 4, Contagion and Diseases of Information, edited by Andrew Goffey, http://journal.fibreculture.org/issue4/issue4_parikka.html.

[5] Deborah Lupton. “Panic Computing: The Viral Metaphor and Computer Technology.” Cultural Studies, vol. 8 (3), October 1994, p. 566.

[6] “Scientists: Virus May Give Link to Life.” SunHerald, May 12, 2004, http://www.sunherald.com/mld/sunherald/news/nation/8649890.htm.

[7] See Bruno Latour. We Have Never Been Modern. New York & London: Harvester Wheatsheaf, 1993.

[8] In addition to perspectives articulated by e.g. Friedrich Kittler and Paul Virilio, see e.g. Stephen Pfohl. “The Cybernetic Delirium of Norbert Wiener.” CTheory 1/30/1997, http://www.ctheory.net/text_file.asp?pick=86. See also Paul N. Edwards. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge & London: The MIT Press, 1996.

[9] See Pierre Sonigo & Isabelle Stengers. L’Évolution. Les Ulis: EDP Sciences, 2003, p. 149. The media-ecological approach is usually connected with works by Marshall McLuhan, Neil Postman and the so-called Toronto School. For a critical evaluation of some media-ecological themes, see Ursula K. Heise. “Unnatural Ecologies: The Metaphor of the Environment in Media Theory.” Configurations, vol. 10, issue 1, Winter 2002, pp. 149-168. See also Matthew Fuller’s recent book Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, MA: MIT Press, 2005. Fuller discerns three strands of media ecology: 1) the organisational understanding of the information ecology at workplaces, etc., 2) the environmentalist media ecologies of e.g. McLuhan, Lewis Mumford, Harold Innis, Walter Ong and Jacques Ellul, which tend to emphasize homeostasis and equilibrium, and 3) the poststructuralist accounts of media ecology by e.g. N. Katherine Hayles and Friedrich Kittler, which can be seen as opening up the too humanistic emphasis of the second category. Fuller adds (pp. 3-5) Félix Guattari’s emphasis on experimentation and probing as a key part of his project, an orientation that I also find very valuable, supplementing e.g. Kittler’s perspectives.

[10] Gilles Deleuze. Spinoza: Practical Philosophy. Trans. Robert Hurley. San Francisco: City
Lights Books, 1988, p. 123.

[11] See Eugene Thacker. “Biophilosophy for the 21st Century.” CTheory 9/6/2005, http://www.ctheory.net/articles.aspx?id=472. In addition, I find Alex Galloway’s notions of the protocological nature of viruses similar to my genealogical point. Viruses act as agents that take advantage of the Net architecture, yet their vectors exceed the predefined limits. See Alexander Galloway. Protocol: How Control Exists After Decentralization. Cambridge, MA & London: The MIT Press, 2004, p. 186.

[12] Fred Cohen. “Computer Viruses – Theory and Experiments”. DOD/NBS 7th Conference on Computer
Security, originally appearing in IFIP-sec, 1984, Online: http://www.all.net/books/virus/index.html.

[13] Thierry Breton & Denis Beneich. Softwar. Paris: Robert Laffont, 1985.

[14] Cohen, “Computer Viruses – Theory and Experiments.”

[15] Rudy Rucker, R.U. Sirius & Queen Mu (eds.). Mondo 2000: A User’s Guide to the New Edge. London: Thames & Hudson, 1993, p. 276. The 1984 Pentagon report “Strategic Computing,” which aimed to bridge the “software gap” with Japan, was grounded in the idea of autonomous predatory machines and in visions of the electronic software battlefields of the 1990s. Manuel DeLanda. War in the Age of Intelligent Machines. New York: Zone Books, 1991, pp. 169-170.

[16] See Tony Sampson. “A Virus in Info-Space.” M/C: A Journal of Media and Culture, 2004, http://journal.media-culture.org.au/0406/07_Sampson.php. Cohen’s work was often dismissed as not addressing a real threat: several commentators were very skeptical about the possibility of a wide-scale spread of such programs, while others regarded Cohen’s tests as dangerous, in the sense that publishing the work would spread the knowledge needed to create viruses.

[17] Fred Cohen. “Computer Viruses.” Dissertation presented at the University of Southern California,
December 1986, p. 12.

[18] Cohen, “Computer Viruses,” p. 25.

[19] This meant especially addressing the problems of transitivity, the flow of information, and in ge
